Maker Pro

design of analog circuits using genetic algorithm


John Larkin

Hi friends, can anyone give me some insight about the design of analog
circuits using genetic algorithms?

People, mostly academics, keep trying this. As far as I know, it
doesn't work. Understanding electronics is still better than random
fiddling; the solution spaces, first for a topology and then for
values, are just too big.

John
 

Joel Koltner

John Larkin said:
People, mostly academics, keep trying this. As far as I know, it
doesn't work. Understanding electronics is still better than random
fiddling; the solution spaces, first for a topology and then for
values, are just too big.

I believe you were the one telling us you're personally much more than just a
giant genetic algorithm yourself though, right, John? :)

I agree with you, although I will point out for the benefit of the O.P. that
using optimizers (genetic algorithms or more traditional ones) to *tweak*
component values once you have a decent topology and reasonably sane starting
values is quite common and successful.
 

Jim Thompson

I believe you were the one telling us you're personally much more than just a
giant genetic algorithm yourself though, right, John? :)

I agree with you, although I will point out for the benefit of the O.P. that
using optimizers (genetic algorithms or more traditional ones) to *tweak*
component values once you have a decent topology and reasonably sane starting
values is quite common and successful.

I ran one of my BandGap designs thru one of those optimizers... killed
it... FUBAR ;-)

...Jim Thompson
 

John Larkin

I believe you were the one telling us you're personally much more than just a
giant genetic algorithm yourself though, right, John? :)

Somehow a few trillion neurons work better than a few thousand lines
of code. Maybe some day computers will be better than people for
circuit design, like they are now for chess. But chess has rules.
I agree with you, although I will point out for the benefit of the O.P. that
using optimizers (genetic algorithms or more traditional ones) to *tweak*
component values once you have a decent topology and reasonably sane starting
values is quite common and successful.

Have you done genetic optimization of circuit values? I guess you'd
first have to come up with a scoring system that defines "best" (like,
for a voltage regulator, something that includes line reg, load reg,
tc, transient response, standard value parts, cost)? Then wrap around
that a simulator, then wrap around that the random value diddler and
genetic selection stuff. I can see that diverging fast. Or rather,
diverging slow. It's easy to get lost in a 17-dimensional space.
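
The overall shape would be something like this (a rough, purely
illustrative Python sketch; score() here is a toy stand-in, a divider
ratio, for whatever the simulator plus spec weighting would actually
report, and none of the names refer to any real tool):

import random

# Standard E24 mantissas; a part is a mantissa times a decade.
E24 = [1.0, 1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.2, 2.4, 2.7, 3.0,
       3.3, 3.6, 3.9, 4.3, 4.7, 5.1, 5.6, 6.2, 6.8, 7.5, 8.2, 9.1]

def random_part():
    """Pick a random standard value between 1 ohm and 9.1 Mohm."""
    return random.choice(E24) * 10 ** random.randint(0, 6)

def score(values):
    """Stand-in for the simulator plus spec weighting (lower is better).
    Toy example: a two-resistor divider aiming for a 0.3217 ratio."""
    r1, r2 = values
    return abs(r2 / (r1 + r2) - 0.3217)

def evolve(n_parts=2, pop_size=50, generations=200, mutation=0.2):
    pop = [[random_part() for _ in range(n_parts)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score)                        # rank by the cost metric
        parents = pop[:pop_size // 2]              # selection: keep the best half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]    # crossover
            if random.random() < mutation:
                child[random.randrange(n_parts)] = random_part()   # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=score)

print(evolve())

The hard part is writing a score() that captures everything you actually
care about.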

Even intelligent diddling and simulation, for something simple like a
filter, can easily become a horror.

I sometimes do brute-force numerical searches for things like crystal
frequencies and divisors that satisfy some number of requirements.
That's not so much genetic as just trying a bazillion possible values
in some nested FOR loops.
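
In code it's nothing fancier than this (crystal list, divisor range, and
target are all made up here for illustration):

# Brute-force search over standard crystals and integer divisors.
crystals = [4.000e6, 4.9152e6, 8.000e6, 10.000e6, 12.000e6, 16.000e6]  # Hz
target = 38.4e3                          # wanted output frequency, Hz

best = None
for f in crystals:                       # outer FOR loop: crystal choice
    for n in range(1, 4097):             # inner FOR loop: divisor
        err = abs(f / n - target)
        if best is None or err < best[0]:
            best = (err, f, n)

err, f, n = best
print(f"{f/1e6:.4f} MHz / {n} = {f/n:.3f} Hz (error {err:.3f} Hz)")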

John
 

Joel Koltner

Hi John,

John Larkin said:
Have you done genetic optimization of circuit values?

Indirectly, yes... Microwave Office has a bunch of optimizers built-in, and
one of them (the "Pointer Optimizer") rotates through a handful of other
optimizers, including a genetic algorithm-based optimizer, in order to try to
achieve better results than any single optimizer alone does. I've used it to
tweak component values in filter designs; for microwave circuits it's also
quite common to use these optimizers (ADS and Ansoft -- the other big uWave
design packages -- have them as well) to tweak matching and biasing networks
and geometries for distributed filters/couplers/etc. It's these "support"
components that get tweaked... I haven't ever seen a design where, e.g.,
someone gave the optimizer a dozen transistors to pick from.
I guess you'd
first have to come up with a scoring system that defines "best" (like,
for a voltage regulator, something that includes line reg, load reg,
tc, transient response, standard value parts, cost)?

Yes... as long as the simulator can measure it, you can (attempt to) optimize
it. For standard part values the usual approach is to make an array of the
standard values available and then optimize on an index into that array. I've
never worried about cost myself since usually all the, e.g., Coilcraft mini
air-spring inductors have approximately the same price, but potentially you
could have it take that into consideration as well.
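
In sketch form the idea is just this (illustrative only, not how MWO
actually implements it internally): the optimizer varies a continuous
number, and that number gets snapped to an entry in the standard-value
table before each evaluation.

# Optimize over an index into a table of standard values.
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def standard_value(x, decades=range(0, 7)):
    """Map a continuous optimizer variable x in [0, 1) to a buyable value."""
    table = sorted(v * 10 ** d for d in decades for v in E12)
    i = min(max(int(x * len(table)), 0), len(table) - 1)
    return table[i]

# An optimizer proposing x = 0.37 gets snapped to a real E12 part:
print(standard_value(0.37))
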
Then wrap around
that a simulator, then wrap around that the random value diddler and
genetic selection stuff. I can see that diverging fast. Or rather,
diverging slow. It's easy to get lost in a 17-dimensional space.

Absolutely... that's why they give you so many different optimizers to play
with. :) (MWO has 14...) Divergence is actually pretty uncommon; far more
often you just see stagnation while it sits and twiddles some component values
a bit but the cost metric doesn't move around much.
Even intelligent diddling and simulation, for something simple like a
filter, can easily become a horror.

Potentially, yes, but what else are you going to do (other than loosen your
specs or, say, add more sections so you're not asking as much from each one...
making your widget bigger and costlier)? Even at lower UHF frequencies there
are usually significant differences between the ideal component values for a
filter and what you actually have to use based on the parasitics of the PCB
and the finite Q of the components. There are filter synthesis tools where you can
specify exactly where you'd like your poles and zeroes and it'll synthesize a
circuit for you, but again you eventually have to move to real components and
it's pretty much intractable to try to directly synthesize a design taking
into account all the warts seen with real inductors. (I am impressed at the
work you see in some of the better filter design books where they take a first
cut at analytically compensating for finite inductor Q by moving the poles
around a bit, though.) Hence, you might as well get close with what you know
and can estimate, then toss in the real component models, and let the
optimizer take a whack at it.

(I'm thinking of relatively tight -- Q>20 -- bandpass filters here. Seems
like that's what I'm always being asked for...)

Good global optimization is obviously a very difficult problem, but personally
I'm amazed at times just how well these algorithms perform.

---Joel
 

Fred Bartoli

[email protected] wrote:
Hi friends, can anyone give me some insight about the design of analog
circuits using genetic algorithms?

Keep meeting girls and making babies until one of them grows into a good
analog designer.

That's the best method ever.
 

Vladimir Vassilevsky

Hi friends, can anyone give me some insight about the design of analog
circuits using genetic algorithms?

Yea. Designing a circuit takes a lot of fucking, especially if you don't
have a clue...

VLV
 

Vladimir Vassilevsky

John Larkin wrote:

Have you done genetic optimization of circuit values?

There are two steps: first optimize a function, then implement a network.
Do that through iterations.
I guess you'd
first have to come up with a scoring system that defines "best" (like,
for a voltage regulator, something that includes line reg, load reg,
tc, transient response, standard value parts, cost)? Then wrap around
that a simulator, then wrap around that the random value diddler and
genetic selection stuff. I can see that diverging fast. Or rather,
diverging slow.

Actually, converging, but very slow. Too many local optima.
It's easy to get lost in a 17-dimensional space.

17 dimensions is not much. Brute force optimization gets slow with 30+
dimensions.
Even intelligent diddling and simulation, for something simple like a
filter, can easily become a horror.

It helps if you can give hints to the optimizer, so it will do a
focused search instead of random guessing.
I sometimes do brute-force numerical searches for things like crystal
frequencies and divisors that satisfy some number of requirements.
That's not so much genetic as just trying a bazillion possible values
in some nested FOR loops.

Sure. The other typical application is the search for the best
combination of standard-value components so that the circuit
satisfies the specs.



Vladimir Vassilevsky
DSP and Mixed Signal Design Consultant
http://www.abvolt.com
 

John Devereux

Vladimir Vassilevsky said:
John Larkin wrote:



There are two steps: first optimize a function, then implement a
network. Do that through iterations.


Actually, converging, but very slow. Too many local optima.


17 dimensions is not much. Brute force optimization gets slow with 30+
dimensions.


It helps if you can give hints to the optimizer, so it will do a
focused search instead of random guessing.


Sure. The other typical application is the search for the best
combination of standard-value components so that the circuit
satisfies the specs.

That's what I do for voltage dividers. We stock a certain subset of
the E12 values. I have a program which tries all combinations of 2 and
3 resistors to find the closest match. (I started out trying to
"optimise" the algorithm, then realised a brute force search would
likely take less time than typing the name of the program).
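
The whole thing boils down to something like this (the stocked list and
target ratio here are made up; the three-resistor case is just one more
nested loop for a series pair in one leg):

from itertools import product

stock = [1.0e3, 1.2e3, 1.5e3, 2.2e3, 3.3e3, 4.7e3, 6.8e3, 10e3, 15e3, 22e3]
target = 0.3217                      # wanted ratio Vout/Vin (illustrative)

# Try every ordered pair (R1 on top, R2 on the bottom); keep the closest ratio.
best = min(((abs(r2 / (r1 + r2) - target), r1, r2)
            for r1, r2 in product(stock, repeat=2)),
           key=lambda t: t[0])

err, r1, r2 = best
print(f"R1 = {r1:.0f}, R2 = {r2:.0f}, ratio = {r2/(r1+r2):.4f} (error {err:.4f})")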
 
Have you done genetic optimization of circuit values? I guess you'd
first have to come up with a scoring system that defines "best" (like,
for a voltage regulator, something that includes line reg, load reg,
tc, transient response, standard value parts, cost)? Then wrap around
that a simulator, then wrap around that the random value diddler and
genetic selection stuff. I can see that diverging fast. Or rather,
diverging slow. It's easy to get lost in a 17-dimensional space.

The problem can be considered an n-dimensional intersection problem, in other words?
 
That's what I do for voltage dividers. We stock a certain subset of
the E12 values. I have a program which tries all combinations of 2 and
3 resistors to find the closest match. (I started out trying to
"optimise" the algorithm, then realised a brute force search would
likely take less time than typing the name of the program).

I think I mentioned a similar problem (different scope) to a math
professor some time ago, and the answer I got was that there is no
solution other than simple brute force. Anyone recall the name of this
type of problem?

Of course, if someone comes up with a way to calculate this type of
problem systematically, I will be interested. :)
 

J.A. Legris

Hi friends, can anyone give me some insight about the design of analog
circuits using genetic algorithms?

From:
http://en.wikipedia.org/wiki/Evolvable_hardware

"The concept was pioneered by Adrian Thompson at the University of
Sussex, England, who in 1996 evolved a tone discriminator using fewer
than 40 programmable logic gates and no clock signal in a FPGA. This
is a remarkably small design for such a device and relied on
exploiting peculiarities of the hardware that engineers normally
avoid. For example, one group of gates has no logical connection to
the rest of the circuit, yet is crucial to its function."

I love that last point! There's a whole universe of possibilities out
there that is beyond the reach of engineering, which itself would
never have come about had it not been for another type of evolved
hardware, the wet stuff between our ears.

Also:

http://scholar.google.ca/scholar?q=genetic+algorithm+electronic+circuit+design&btnG=Search

Joe
 

Martin Brown

John Larkin said:
Somehow a few trillion neurons work better than a few thousand lines
of code. Maybe some day computers will be better than people for
circuit design, like they are now for chess. But chess has rules.

And so does circuit design. Although the intuitive, creative step of
defining the architecture is still well beyond modern computation,
optimising components in an existing architecture is now quite
practicable, even on a PC.
Have you done genetic optimization of circuit values? I guess you'd
first have to come up with a scoring system that defines "best" (like,
for a voltage regulator, something that includes line reg, load reg,
tc, transient response, standard value parts, cost)? Then wrap around
that a simulator, then wrap around that the random value diddler and
genetic selection stuff. I can see that diverging fast. Or rather,
diverging slow. It's easy to get lost in a 17-dimensional space.

17 dimensions is no real challenge to modern optimisers.

No modern least squares (or 1-Norm) optimiser should ever diverge (that
was true even a couple of decades ago). What tends to happen is that
they get trapped in steep diagonal valleys or at local minima and never
find the true global optimum.

Simplex isn't too bad if you already have some idea of how big a range
of parameters you have to cover. Conjugate gradients will handle most
difficult problems fairly well given a suitable starting point, and
something like simulated annealing is about as good as it gets for
global optimisation irrespective of the initial starting point. Genetic
algorithms are similar to the latter, but rely on an ensemble of
simulations with parameters that are allowed to breed according to their
success rating.

I reckon simulated annealing is easier to use than GA. YMMV.
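
A bare-bones simulated annealing loop looks roughly like this (everything
here is illustrative; in practice cost() would wrap the circuit simulator
and the spec weighting):

import math, random

def anneal(x0, cost, step, t_start=1.0, t_end=1e-3, iters=20000):
    """Generic simulated annealing: always accept downhill moves, accept
    uphill moves with a probability that shrinks as the temperature cools."""
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    for i in range(iters):
        t = t_start * (t_end / t_start) ** (i / iters)   # geometric cooling
        x_new = step(x)                                  # random perturbation
        c_new = cost(x_new)
        if c_new < c or random.random() < math.exp((c - c_new) / t):
            x, c = x_new, c_new
            if c < best_c:
                best_x, best_c = x, c
    return best_x, best_c

# Toy usage: minimise a 2-D bowl. A real cost() would call the simulator.
cost = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2
step = lambda x: [v + random.gauss(0, 0.1) for v in x]
print(anneal([0.0, 0.0], cost, step))
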
Even intelligent diddling and simulation, for something simple like a
filter, can easily become a horror.

It should not be if you know how the free parameters are interrelated.
Filter design is one case where diddling individual parameters in a 1-D
optimal search strategy will almost never get you what you want. There are
specialised codes around for optimal filter design.
I sometimes do brute-force numerical searches for things like crystal
frequencies and divisors that satisfy some number of requirements.
That's not so much genetic as just trying a bazillion possible values
in some nested FOR loops.

There is probably a faster way to do that but if it is fast enough then
fine.

Regards,
 

Paul Burke

J.A. Legris said:
"For example, one group of gates has no logical connection to
the rest of the circuit, yet is crucial to its function."

I wonder if the resulting "design" worked in another instance of the
same FPGA?
 

J.A. Legris

I wonder if the resulting "design" worked in another instance of the
same FPGA?

Probably not. Maybe subsequent versions could evolve adaptive
techniques to tweak themselves into the right state.

Did you know that young eyeballs grow to assume the correct focal
length based on feedback from the retina? Evidently, reading in poor
light when you're a kid is bad for you after all. Low light => wide
open lens aperture and short depth of field with high aberration =>
chronically poor close focus while reading => eyeball attempts to
correct by moving the focal plane further back => permanent myopia.
 

John Larkin

From:
http://en.wikipedia.org/wiki/Evolvable_hardware

"The concept was pioneered by Adrian Thompson at the University of
Sussex, England, who in 1996 evolved a tone discriminator using fewer
than 40 programmable logic gates and no clock signal in a FPGA. This
is a remarkably small design for such a device and relied on
exploiting peculiarities of the hardware that engineers normally
avoid. For example, one group of gates has no logical connection to
the rest of the circuit, yet is crucial to its function."

I love that last point! There's a whole universe of possibilities out
there that is beyond the reach of engineering, which itself would
never have come about had it not been for another type of evolved
hardware, the wet stuff between our ears.

Also:

http://scholar.google.ca/scholar?q=genetic+algorithm+electronic+circuit+design&btnG=Search

Sounds like Zebulum doesn't understand electronics design *or*
evolution.

John
 

Joel Koltner

Paul Burke said:
I wonder if the resulting "design" worked in another instance of the same
FPGA?

Most likely not. As I recall the "design" had a horrible TempCo and was
completely unmanufacturable (i.e., yields would have been near-zero).

I guess that's what "basic research" is for, though... he didn't demonstrate
anything that couldn't be done much better using traditional methods, but he's
trying to make the argument that, with enough further research, his approach
might become a viable design technique.
 

John Popelish

Joel said:
Most likely not. As I recall the "design" had a horrible TempCo and was
completely unmanufacturable (i.e., yields would have been near-zero).
(snip)

But only because those variables were not included in the
evolutionary environment. Evolution has no imagination to
predict future constraints. But it can, in some cases,
produce very robust designs, if the environment exposes the
replicating "organisms" to a wide range of stresses, without
wiping the "species" out. Evolution adapts "organisms" to
the pressures they are exposed to, while the evolution is
happening, or fails and the "species" goes extinct.

This is one reason that scientists are so worried about
global thermal runaway. It is expected to produce new
environmental conditions that will wipe out a large number
of species that evolved in circumstances that are different
from what is to come, and come too quickly for them to
evolve the necessary changes. Similar low yields.

Only bacteria and insects evolve fast enough to take
advantage of rapid changes in the environment. There is
expected to be a bloom of new diseases taking advantage of
the newly stressed food populations (humans, for example)
that will speed many extinctions. We are not robust, much
like those chips, if our environment changes rapidly.
Especially the disease environment.
 

Tim Williams

I think I mentioned a similar problem (different scope) to a math
professor some time ago, and the answer I got was that there is no
solution other than simple brute force. Anyone recall the name of this
type of problem?

The famously "unsolvable" (in optimal time, that is) puzzles are
NP-complete.

With a set of resistors and rules for only one or a few in series at any
given point, you could optimize the search by excluding indices that give
wrong values (why check ratios of 0.997 when you're looking for 0.431?) and
binary-searching the lists (since sublists of the master list are always in
order). Further optimization could be added by taking preferences, e.g.,
checking certain ratios of values that come up commonly (such as one
resistor in series with a much smaller resistor (therefore selected from a
much smaller set), adding to a value in between adjacent values).
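
For the series-pair case that binary search looks roughly like this
(values are illustrative): sort the candidate list once, then for each
first resistor bisect for its complement instead of scanning the whole
list.

from bisect import bisect_left

stock = sorted([1.0e3, 1.2e3, 1.5e3, 2.2e3, 3.3e3, 4.7e3, 6.8e3, 10e3, 15e3, 22e3])

def best_series_pair(target):
    """Closest series pair to `target`, using bisection on the sorted list
    rather than checking every combination."""
    best = None
    for i, r1 in enumerate(stock):
        want = target - r1                     # complement we'd like to find
        j = bisect_left(stock, want, lo=i)     # only search the sorted tail
        for k in (j - 1, j):                   # the two neighbours bracket it
            if i <= k < len(stock):
                err = abs(r1 + stock[k] - target)
                if best is None or err < best[0]:
                    best = (err, r1, stock[k])
    return best

print(best_series_pair(12.5e3))                # illustrative target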

Tim
 