Re: current control algorithm

I have two strings of 5 LEDs each, and I need to control them with a PWM
to reach a certain intensity (I can measure the current). Due to lack of
time, I didn't go with a PI controller (which I think would be the right
way); instead I just defined a hysteresis band of x counts (10-bit ADC)
and require the measured current to stay within that band.
Unfortunately I now get high jitter on my LED current (which is
controlled by the duty cycle). I assume this is caused by the time
delay between setting the duty cycle and measuring the resulting
current rise/fall. How do I best go about this?
Is there a quick fix, or do I actually need to take the time to study
and implement a PI controller?

Every time you measure your current with the ADC, do something like this:

PWM += gain * (setpoint - current);

Do the calculation in floating point, or in fixed point with enough bits for your parameters, and clip PWM at its min/max to avoid wrap-around.

Experiment with different gain settings to see which one works best.
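To make that concrete, here is a minimal fixed-point sketch of the update in C. The names (`PWM_MAX`, `control_step`, the gain constants) are illustrative, not from any particular vendor library; a 10-bit ADC and 10-bit PWM are assumed, and the gain of 1/8 is just a starting point to tune from:

```c
#include <stdint.h>

#define PWM_MAX  1023   /* assumed 10-bit PWM duty-cycle range */
#define GAIN_NUM 1      /* gain = 1/8 in fixed point; tune for your LEDs */
#define GAIN_DEN 8

static int32_t pwm = 0; /* accumulator, wider than the PWM register */

/* Call once per ADC sample; returns the new duty-cycle value
   to write to the PWM register. setpoint and current are in
   ADC counts. */
int32_t control_step(int32_t setpoint, int32_t current)
{
    int32_t error = setpoint - current;
    pwm += (error * GAIN_NUM) / GAIN_DEN;   /* integrate the error */

    /* Clip to avoid wrap-around (and integrator wind-up) */
    if (pwm < 0)       pwm = 0;
    if (pwm > PWM_MAX) pwm = PWM_MAX;

    return pwm;
}
```

Because the output accumulates rather than jumping to a new value each sample, a measurement delay of a sample or two just slows convergence instead of causing the bang-bang oscillation a hysteresis comparator produces; a smaller gain trades response speed for stability.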