Maker Pro

AREF bypass capacitance on ATMega2560?


Joerg

rickman said:
I don't see it for $3. Did you get a quote for your project? TI says
it is $5 to $8 at qty 1k depending on the flavor. You still need to add
Flash.

1k qty is $3.67 at Digikey:

http://www.digikey.com/product-detail/en/TMS320C5532AZHH10/296-32741-ND/2749713

$3.02 with 12wks leadtime at Arrow:

http://components.arrow.com/part/detail/51425505S8988412N7713?region=na

ROM is included.

Yes, and many FPGAs have "tons" of memory on board although not for
$3... but then this isn't a $3 part either...

It is a $3 part. See above.
Because you were looking at FPGAs that were "painfully" large I'm sure.
How many pins, 256, 400+...? 20,000 LUTs, 40,000 LUTs?

Sure, that's why I wrote "powerful". The less powerful ones have never
impressed me much but I have to admit that I haven't looked the last
five years. So, tell us, which FPGA can fully replace the above DSP for
three bucks and where could one buy it off the shelf?

So you could use FPGA instead of an MCU on most of your designs? I'm
still confused. Are you saying if you can't use a part on all of your
designs you don't want to use it on any?

No, what I am saying is that it doesn't matter much which parts I use (my
clients decide that anyhow) but that finding a programmer locally can be
important. Some projects will not really get off the ground if the
programmer isn't local. Or it can take forever.
I have done many jobs remotely and it has *never* been a problem. If
debugging is needed in the field, that is not the same thing as saying
the developer has to be in the field. You are welcome to disagree on this.

I do disagree.

I can only remember one time where I iterated through design changes in
the field and that was actually a case where I could have done it all
remotely, but being there pleased the customer. Turns out he
wasn't initializing the design right and it was hard to detect.

Well, if you are standing next to a huge roaring engine and this, that
and the other subtle disturbance has to be ironed out it would be a
major problem if the programmer is three time zones away. You don't have
to believe me but that's how it is with some of my assignments.

Just like EMC jobs on large gear can simply not be done remotely. Which
is why I bought the Signalhound analyzer (fits into carry-on).

One big advantage to FPGAs is the ability to simulate at a low level. It
is very easy to emulate the I/O in an HDL simulation. I don't know how
they do that with MCUs, but in the FPGA world I've never had a problem.

That is undoubtedly true. Plus probably less in errata headaches.
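As an aside, the kind of bit-level I/O emulation described above can be sketched even outside an HDL. Here is a minimal Python stand-in (illustrative only; the function names and the 8-N-1 framing are my assumptions, and a real testbench would drive clocked signals in a simulator rather than lists) that exercises a serial line the way an HDL testbench exercises a design's pins:

```python
# Python stand-in for an HDL testbench: generate and check a device's
# serial I/O at the bit level, entirely in simulation. Function names
# and framing (8-N-1) are illustrative assumptions, not from the thread.

def uart_frame(byte):
    """Serialize one byte as 8-N-1: start bit, data LSB first, stop bit."""
    bits = [0]                                   # start bit (line low)
    bits += [(byte >> i) & 1 for i in range(8)]  # 8 data bits, LSB first
    bits += [1]                                  # stop bit (line high)
    return bits

def uart_receive(bits):
    """Decode one 8-N-1 frame; return the byte, or None on framing error."""
    if bits[0] != 0 or bits[9] != 1:
        return None
    return sum(b << i for i, b in enumerate(bits[1:9]))

# Round-trip every byte value through the simulated serial line
assert all(uart_receive(uart_frame(b)) == b for b in range(256))
```

An HDL testbench does the same thing with clocked signals and assertions; the point is that the entire I/O contract can be checked before any hardware exists.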

[...]

Yeah... Even if you need minor logic let's talk. I find the real
advantage to FPGAs is in the fact that I can do almost *anything* in
one. I don't need to add stuff unless it is very application specific
like the CD quality CODEC on my current production board. I could get
CD quality from an FPGA design, but it wouldn't be worth the work to
save a $3 part.

So far the next project on the books that will contain logic is 2nd half
of next year. It will need a 16-bit audio codec as well but we can use
an external chip for that, PCM29xx series et cetera. Would (probably)
even keep it external with a uC because the 16-bit ADCs on those aren't
very quiet.
You do know that it is not hard to add an ADC to an FPGA? No need for a
mux if all the inputs can be connected to separate pins. It all depends
on the resolution you need. 8 bits is easy, 16 bits - not so easy.

Mine are never less than 12-bit, often 16-bit.
That's fine, just don't judge all FPGAs by the ones you have seen. If
the only MCU you had worked with were ARM 11s using 3 Watts and running
a cell phone down in 8 hours, would you think the PIC MCUs were the same?

Well, give us all here an example of an FPGA that can fully emulate a
TMS320 and costs $3 :)
You are missing my point.

I guess then I don't see your point. Flash is available on almost
everything nowadays.
 

Ulf Samuelsson

rickman wrote 2013-09-06 23:12:
If your FPGA designs are expensive or power hungry, then you are doing
things you can't do in an MCU or you are not using FPGAs properly. They
don't need to use any more power than an MCU and in many cases less.
They certainly don't need to be significantly more expensive unless you
consider every dollar in your designs. At the very low end MCUs can be
under $1 and still have reasonable performance. For $1 you can't get
much in the way of programmable logic. For $3 however, you can get a
chip large enough for a CPU (with math) and room for your special logic.



Do you work in *every* town? I find if I need a skill, it doesn't
matter where the worker is. I also find FPGA design is pretty common
these days. If you are ever short on FPGA design skills, give me a shout.



Yes, the ideal chip would add a section of FPGA fabric to a conventional
MCU and vary the size just as they do the RAM and Flash. I think at
this point the problem is cultural. Most FPGA companies are all about
the big iron selling for $100 a chip min and the MCU makers don't
understand programmable logic. Xilinx and Cypress are good examples.
Zynq is a very high end solution, great if it matches your problem. PSoC
is not a bad idea, but their idea of programmable logic isn't worth much.

Give me a good $10 MCU/FPGA combo and I'll design the world! In the
meantime I'll be using soft cores. They are a lot easier to program
than an MCU anyway.

That was available with the FPSLIC. AVR core with 5/10/40k gates.
Unfortunately the combination of features didn't make sense.

40 MHz AVR with only 36 kB SRAM.

It would have been much better with an ARM7 core
with an external memory bus.

Since it was implemented in one technology and never shrunk,
it eventually became cheaper to use a softcore with
a standard FPGA in a fine geometry technology.

I came to the conclusion that an MCU company cannot compete with an FPGA
company unless the programmable logic part is only a small percentage of
the die.

It would make sense to have a 5 kgate FPGA in a modern micro.

The Xilinx Zynq and Altera SoC are nice tries, except that most
of the standard peripherals suck and it is still beyond the price
most customers I've seen are prepared to pay.


BR
Ulf Samuelsson
 

rickman


Not the same part pal. You're trying to pull a fast one on me? Are we
talking about the TMS320C5535 with "tons" of memory or the TMS320C5532
with *much* less memory?


ROM is not Flash... is it? Are you thinking in terms of a mask ROM?

It is a $3 part. See above.

No, you need to pick a part number and stick with it.

Sure, that's why I wrote "powerful". The less powerful ones have never
impressed me much but I have to admit that I haven't looked the last
five years. So, tell us, which FPGA can fully replace the above DSP for
three bucks and where could one buy it off the shelf?

Lol, I don't know where you are coming from now. You don't like the
large FPGAs because they use power like they are... well, large. You
don't like the small FPGAs because they are, well... small.

When you stop using terms like "powerful" and want to actually consider
an FPGA for a design I can help you. Gut feel is nice as long as it is
based in fact.

No, what I am saying is that it doesn't matter much which parts I use (my
clients decide that anyhow) but that finding a programmer locally can be
important. Some projects will not really come off the ground if the
programmer isn't local. Or it can take forever.

That is the point I am trying to dispute. My experience has been that a
remote team is a very capable team and can do pretty much any job.
Colocation is seldom an important issue. So far your statements have
all supported that with a few exceptions. At least that is how I have
read the words. "Some projects", "For most work it doesn't
matter where the worker is", etc... YMMV

I do disagree.

Ok, but the examples you give tend to support remote work except in a
few cases. I'm just reading the words you wrote, I'm not putting them
in your mouth.

Well, if you are standing next to a huge roaring engine and this, that
and the other subtle disturbance has to be ironed out it would be a
major problem if the programmer is three time zones away. You don't have
to believe me but that's how it is with some of my assignments.

Just like EMC jobs on large gear can simply not be done remotely. Which
is why I bought the Signalhound analyzer (fits into carry-on).

I don't argue your needs. You keep saying these special jobs are a
small percentage of your work. So most assignments will work just fine
with remote support, no?

One big advantage to FPGAs is the ability to simulate at a low level. It
is very easy to emulate the I/O in an HDL simulation. I don't know how
they do that with MCUs, but in the FPGA world I've never had a problem.

That is undoubtedly true. Plus probably less in errata headaches.

[...]

Yeah... Even if you need minor logic let's talk. I find the real
advantage to FPGAs is in the fact that I can do almost *anything* in
one. I don't need to add stuff unless it is very application specific
like the CD quality CODEC on my current production board. I could get
CD quality from an FPGA design, but it wouldn't be worth the work to
save a $3 part.

So far the next project on the books that will contain logic is 2nd half
of next year. It will need a 16-bit audio codec as well but we can use
an external chip for that, PCM29xx series et cetera. Would (probably)
even keep it external with a uC because the 16-bit ADCs on those aren't
very quiet.

Not sure what the requirements are for your CODEC, but I have been using
the AKM parts with good results. Minimal or *no* programming,
configuration is done by a small number of select pins, very simple. I
have yet to find another one as small, AK4552, AK4556.

Mine are never less than 12-bit, often 16-bit.

What about delay time? Is sigma-delta ok?

Well, give us all here an example of an FPGA that can fully emulate a
TMS320 and costs $3 :)

There are a number of $3 FPGAs. I can't imagine why you would want to
emulate a DSP in an FPGA. I would rather design a project using an
FPGA. TI likely has a restriction about running their code compiled
with their tools in an FPGA softcore. So what is your project?

I guess then I don't see your point. Flash is available on almost
everything nowadays.

I was comparing FPGAs to FPGAs. Many FPGAs require external Flash, some
don't. Just as most DSPs require external Flash while most MCUs don't.
 

rickman

rickman wrote 2013-09-06 23:12:

That was available with the FPSLIC. AVR core with 5/10/40k gates.
Unfortunately the combination of features didn't make sense.

40 MHz AVR with only 36 kB SRAM.

It would have been much better with an ARM7 core
with an external memory bus.

I don't know about the external memory bus, but the problem I had with
the FPSLIC is that I was never convinced it would last. It did, however,
even with a very limited market penetration. I don't know how large the
FPGA fabric was, nobody usefully measures FPGAs in gates anymore. So
I'm guessing it was actually very limited. I know for a fact it was
rather pricey.

Since it was implemented in one technology and never shrunk,
it eventually became cheaper to use a softcore with
a standard FPGA in a fine geometry technology.

I came to the conclusion that an MCU company cannot compete with an FPGA
company unless the programmable logic part is only a small percentage of
the die.

Compete how? The FPGA companies are leaving huge segments of the market
open to anyone who has the guts to exploit it.
It would make sense to have a 5 kgate FPGA in a modern micro.

I think 5 kgate is a tiny FPGA - what, a couple of 22V10s? It would be
about the size of a UART in an MCU design. Try 2-5 kLUTs; then you have
a useful device. Otherwise you are just pissing in the wind. The FPGA
fabric should be roughly the same die size as the Flash or RAM and
should scale as the memory does.

The Xilinx Zynq and Altera SoC are nice tries, except that most
of the standard peripherals suck and it is still beyond the price
most customers I've seen are prepared to pay.

You need to understand their markets. Their designs are very good for
their markets which are high profit margin. Thinking like a $1 MCU
maker won't let you compete in their space. But you can compete in a $2
to $10 MCU+FPGA space. So get out there and explain it to your
management! Otherwise you lose design wins to softcores which get
cheaper every day.
 

Joerg

rickman said:
Not the same part pal. You're trying to pull a fast one on me? Are we
talking about the TMS320C5535 with "tons" of memory or the TMS320C5532
with *much* less memory?

It doesn't have the single access RAM but it does have 64k dual access
RAM. That's a lot of RAM in embedded.
ROM is not Flash... is it? Are you thinking in terms of a mask ROM?

You can use the bootloader or OTP your own bootloader if you don't want
to store your programming in ROM. In most situations this is part of a
larger computerized system from where it can download its programming.

No, you need to pick a part number and stick with it.

I gave a part number. Still waiting for your $3 FPGA part number :)
Lol, I don't know where you are coming from now. You don't like the
large FPGAs because they use power like they are... well, large. You
don't like the small FPGAs because they are, well... small.

When you stop using terms like "powerful" and want to actually consider
an FPGA for a design I can help you. Gut feel is nice as long as it is
based in fact.

So then, what would be an FPGA that can fully contain the above TMS320
and what does it cost?

Which one could replace the MSP430F6733 including its ADCs and all, for
around $3?

http://www.ti.com/lit/ds/symlink/msp430f6720.pdf

[...]

I don't argue your needs. You keep saying these special jobs are a
small percentage of your work. So most assignments will work just fine
with remote support, no?

Yes, for myself I'd say it's >90%. For firmware and software support,
much lower percentage. Currently <50% of cases. Meaning I (or some other
folks) and the programmers have to literally work side-by-side.

Mostly this simply has to do with the fact that gear is too large to
ship and often also because operation by someone not used to it can
become dangerous. When pressures are several thousand psi and someone
screws up people could die.
One big advantage to FPGAs is the ability to simulate at a low level. It
is very easy to emulate the I/O in an HDL simulation. I don't know how
they do that with MCUs, but in the FPGA world I've never had a problem.

That is undoubtedly true. Plus probably less in errata headaches.

[...]

I've used schematic based tools and both VHDL and Verilog. I've worked
with the vendor's tools and third party tools including the NeoCAD tools
which became Xilinx tools when Xilinx bought them.

If anyone tells you they only know one brand of FPGA you are talking to
an FPGA weenie. I find MCUs to vary a *great* deal more than FPGAs in
terms of usage. MCUs need all sorts of start up code and peripheral
drivers, clock control, etc, etc, etc. FPGAs not so much. They mostly
have the same features and most of that can be inferred from the HDL so
you never need to look too hard under the hood.

In short, there is a lot of FUD about FPGAs. Talk to someone who
doesn't buy into the FUD.


Yeah, maybe next time I need major logic we should talk. But better not
about politics :)

Yeah... Even if you need minor logic let's talk. I find the real
advantage to FPGAs is in the fact that I can do almost *anything* in
one. I don't need to add stuff unless it is very application specific
like the CD quality CODEC on my current production board. I could get
CD quality from an FPGA design, but it wouldn't be worth the work to
save a $3 part.

So far the next project on the books that will contain logic is 2nd half
of next year. It will need a 16-bit audio codec as well but we can use
an external chip for that, PCM29xx series et cetera. Would (probably)
even keep it external with a uC because the 16-bit ADCs on those aren't
very quiet.

Not sure what the requirements are for your CODEC, but I have been using
the AKM parts with good results. Minimal or *no* programming,
configuration is done by a small number of select pins, very simple. I
have yet to find another one as small, AK4552, AK4556.

Plus their prices are quite good.

[...]

What about delay time? Is sigma-delta ok?

As long as I can get into the tens of ksps.
There are a number of $3 FPGAs. I can't imagine why you would want to
emulate a DSP in an FPGA. I would rather design a project using an
FPGA. TI likely has a restriction about running their code compiled
with their tools in an FPGA softcore. So what is your project?

I wasn't referring to a specific project, just your claim that FPGA can
do the same job as processors at the same price.

One project will probably require something of the caliber of a
MSP430F6733. Whether this kind or a DSP, what is key is that we are able
to use pre-cooked library routines. In my case for complex (I/Q) signal
processing, FFT, non-linear filtering and so on. Sometimes legacy
routines must be kept and in an FPGA that would require an IP block that
can emulate the respective processor well enough.

But what I see most is this: The respective client has in-house talent
for writing code. They are familiar with a particular architecture, have
a vast arsenal of re-usable code built up, and naturally they do not
wish to give this up. If it's a dsPIC like last year, then that goes in
there. If it's Atmel and MSP430 like this year, then that is used. Has
to be, the customer is king.

[...]
 

Joerg

rickman said:
At Digikey search on iCE40. That will give you a listing of all the
parts. Looks like Digikey's prices are a little high, not just because
they are Digikey. The $3 figure I gave is from quotes I got from
Silicon Blue before they were traded through Digikey. ...


Which part number?

I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.

... Like I said, you
need to get a quote on any FPGA to find the "real" price.

I don't play those games. Or only once every 20-some years when buying a
car, and then the car dealer may develop an ulcer when we are done.
I don't think it is all about the "haggle". I think no small part is
about doing what it takes to get the design in and deny the competition.
When you get quotes from Digikey web site they never even know you are
looking at their parts. Think of it as a live auction vs. sealed bids.
They prefer to be in the live auction.

Well, most design engineers don't. TTI lost a deal on capacitors last
week because they wouldn't let me look at the details of a cap without
entering some stupid captcha. Which as usual didn't work. I moved on and
found it elsewhere. I simply do not have the time for such games.
 

Joerg

Ulf Samuelsson wrote:

[...]
I came to the conclusion that an MCU company cannot compete with an FPGA
company unless the programmable logic part is only a small percentage of
the die.

They can compete, with the peripherals. But those have to be of good
performance and most of all versatile enough. Some things can probably
be achieved in simple ways. For example, by providing the option to map
two pins per ADC channel for folks who need it to be low noise.

It would make sense to have a 5 kgate FPGA in a modern micro.

Bingo! That would be fantastic.

The Xilinx Zynq and Altera SoC are nice tries, except that most
of the standard peripherals suck and it is still beyond the price
most customers I've seen are prepared to pay.

I hope this doesn't rub you the wrong way but Atmel often ain't that
great either when it comes to peripherals. When I looked at the
"reference" in the ATMega2560 that we just designed in I almost got
sick. In the relayout we piped in our own reference.
 

Paul Rubin

Joerg said:
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.

I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUTs somehow) as efficiently as on the MCUs.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGAs with MCU blocks.
 

rickman

It doesn't have the single access RAM but it does have 64k dual access
RAM. That's a lot of RAM in embedded.

You do this often. Start talking about one thing and shift the context
to another. Projects have design requirements. I often am able to meet
my design requirements with an FPGA and no MCU. I often can't say the
opposite, being able to use an MCU without the FPGA.

You can use the bootloader or OTP your own bootloader if you don't want
to store your programming in ROM. In most situations this is part of a
larger computerized system from where it can download its programming.

That's a different wrinkle. It is common to have a micro load an FPGA,
many don't contain their own Flash. But I haven't seen this done with
DSPs as often and almost never with MCUs. But if that works for your
project, great. You certainly wouldn't have a problem loading an FPGA
then.

I gave a part number. Still waiting for your $3 FPGA part number :)

Actually you gave me two part numbers, one for $5 and one for just over
$3. What's your point? I gave you info to find the iCE40 line. Xilinx
also makes FPGAs that are very affordable, as do Altera and Lattice.

So then, what would be an FPGA that can fully contain the above TMS320
and what does it cost?

Which one could replace the MSP430F6733 including its ADCs and all, for
around $3?

http://www.ti.com/lit/ds/symlink/msp430f6720.pdf

I have already explained that I would never do a design in an FPGA to
wholly incorporate a DSP or MCU. That would be absurd. So why do you
keep asking about that?

Depending on your design requirements there are any number of FPGAs that
will do the job and some may be $3. What are your design requirements?

Yes, for myself I'd say it's >90%. For firmware and software support,
much lower percentage. Currently <50% of cases. Meaning I (or some other
folks) and the programmers have to literally work side-by-side.

Mostly this simply has to do with the fact that gear is too large to
ship and often also because operation by someone not used to it can
become dangerous. When pressures are several thousand psi and someone
screws up people could die.

I understand the concept of work that can't be moved. You don't need to
continue to explain that. I was asking why you said most of your work
didn't have that requirement and yet you still were debating the point.
Now I get it, you are talking about two different things, work that
can be moved and work that can't be moved.

One big advantage to FPGAs is the ability to simulate at a low level. It
is very easy to emulate the I/O in an HDL simulation. I don't know how
they do that with MCUs, but in the FPGA world I've never had a problem.


That is undoubtedly true. Plus probably less in errata headaches.

[...]


I've used schematic based tools and both VHDL and Verilog. I've worked
with the vendor's tools and third party tools including the NeoCAD tools
which became Xilinx tools when Xilinx bought them.

If anyone tells you they only know one brand of FPGA you are talking to
an FPGA weenie. I find MCUs to vary a *great* deal more than FPGAs in
terms of usage. MCUs need all sorts of start up code and peripheral
drivers, clock control, etc, etc, etc. FPGAs not so much. They mostly
have the same features and most of that can be inferred from the HDL so
you never need to look too hard under the hood.

In short, there is a lot of FUD about FPGAs. Talk to someone who
doesn't buy into the FUD.


Yeah, maybe next time I need major logic we should talk. But better not
about politics :)

Yeah... Even if you need minor logic let's talk. I find the real
advantage to FPGAs is in the fact that I can do almost *anything* in
one. I don't need to add stuff unless it is very application specific
like the CD quality CODEC on my current production board. I could get
CD quality from an FPGA design, but it wouldn't be worth the work to
save a $3 part.


So far the next project on the books that will contain logic is 2nd half
of next year. It will need a 16-bit audio codec as well but we can use
an external chip for that, PCM29xx series et cetera. Would (probably)
even keep it external with a uC because the 16-bit ADCs on those aren't
very quiet.

Not sure what the requirements are for your CODEC, but I have been using
the AKM parts with good results. Minimal or *no* programming,
configuration is done by a small number of select pins, very simple. I
have yet to find another one as small, AK4552, AK4556.

Plus their prices are quite good.

Which, AKM or the other? I'd like to think I can get a CD quality CODEC
for $3 from nearly anyone. I mainly picked AKM because of the size, 6x6
mm without going to a microBGA.

As long as I can get into the tens of ksps.

High tens or low tens? Up to, say, 20 or 30 ksps is easy to do in an
FPGA with decent resolution, 12 bits. Getting 16 bits is harder; I've
not tried it, but it should be possible.

I was looking at using an LVDS input as a comparator and Xilinx did it
in a demo of an SDR. They are very short on details, but they talk about
1 mV on the input. I know that's not anything special; I'm hoping to do
better, much better.

I wasn't referring to a specific project, just your claim that FPGA can
do the same job as processors at the same price.

Yes, that is my claim. The obvious exception is when some feature is
needed that just isn't available in an FPGA. I'm not saying *every*
project can be done better in an FPGA. I'm saying that designers tend
to just not consider FPGAs when they are often viable solutions.

One project will probably require something of the caliber of a
MSP430F6733. Whether this kind or a DSP, what is key is that we are able
to use pre-cooked library routines. In my case for complex (I/Q) signal
processing, FFT, non-linear filtering and so on. Sometimes legacy
routines must be kept and in an FPGA that would require an IP block that
can emulate the respective processor well enough.

Ok, that is likely a no-go. If you really want to emulate a DSP chip
then an FPGA is not likely to be a useful way to proceed. Wanting to
run DSP precompiled library code is a bit of an extreme requirement. If
the customer wants a DSP, then by all means give them a DSP. But don't
automatically exclude an FPGA from the task.

But what I see most is this: The respective client has in-house talent
for writing code. They are familiar with a particular architecture, have
a vast arsenal of re-usable code built up, and naturally they do not
wish to give this up. If it's a dsPIC like last year, then that goes in
there. If it's Atmel and MSP430 like this year, then that is used. Has
to be, the customer is king.

Yeah, well that is a deal killer for *any* other alternative. That is
not related to what I was saying. My point is that if you don't have
any specific requirement that dictates the use of a given chip, an FPGA
has as good a chance at meeting the requirements as an MCU or DSP. In
fact, FPGAs are what get used when DSPs aren't fast enough. My point is
you don't have to limit them to the high end. They also do very well at
the low end.
 

Joerg

Paul said:
I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUTs somehow) as efficiently as on the MCUs.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGAs with MCU blocks.


Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.

It's like with Excel. There are better programs out there but one core
reason why companies stick to it is the huge repository of VBA
applications they have created over the years.
 

rickman

I had thought the parts of those processors that would bloat up badly
(instruction decode etc.) are pretty simple so the overall effect of the
bloat is ok in the scheme of things. The parts doing the most work
(memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
adders connected to the LUTs somehow) as efficiently as on the MCUs.

I do think softcores seem like a silly idea a lot of the time, and am
looking forward to more low end FPGAs with MCU blocks.

How about an MCU array instead?

http://www.greenarraychips.com/

BTW, considering a softcore "silly" is not a useful engineering
analysis. Softcores can save money and/or time in a project. But then
people often have biases and aren't willing to look at things from a
different perspective.

Bernd Paysan rolled his own small processor design for an ASIC and was
able to save time *and* money in development as well as silicon cost
over using a canned core in a custom ASIC design. Dig around a bit and
find the B16 for more details. Or he would be happy to relate the
fiasco of the whole project (it started with an 8051 core).
 

rickman

Much of it has to do with legacy code. Yes, some things could even be
done more efficiently in the FPGA because you can actually streamline
the HW to the task, something neither uC nor DSP allow. For example, why
have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
But legacy code won't run anymore and you need FPGA specialists to make
it all work.

No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

But that is exactly the point of the FPGA in DSP apps. You code to the
app, not to a processor.
 

Paul Rubin

rickman said:
How about an MCU array instead? http://www.greenarraychips.com/

Yes, we've had many discussions about that part ;-).
considering a softcore "silly" is not a useful engineering analysis.

The engineering analysis is implied: it takes far more silicon to
implement a microprocessor in LUTs than directly in silicon, plus you
lose a lot of speed because of all the additional layers and lookups.
Bernd Paysan rolled his own small processor design for an ASIC

Yes, the ASIC bypassed the relative inefficiency of doing the same thing
in FPGAs. It would be cool to have some tiny processors like that
available as hard cells on small FPGAs.
 

Joerg

rickman said:
You do this often. Start talking about one thing and shift the context
to another. ...


I didn't. I said it is a DSP with large memory, which it is.

... Projects have design requirements. I often am able to meet
my design requirements with an FPGA and no MCU. I often can't say the
opposite, being able to use an MCU without the FPGA.



That's a different wrinkle. It is common to have a micro load an FPGA,
many don't contain their own Flash. But I haven't seen this done with
DSPs as often and almost never with MCUs. But if that works for your
project, great. You certainly wouldn't have a problem loading an FPGA
then.

The fact that most FPGAs don't have Flash is fine ... but ... there must
be a decent bootloader inside. In one of the upcoming projects it must
be able to bootload via USB. So the device must wake up with a certain
minimum in brain functionality to handle the USB stuff. With an FPGA
that can become a challenge unless you provide a serial memory device
(which adds cost).
Actually you gave me two part numbers, one for $5 and one for just over
$3. What's your point? I gave you info to find the iCE40 line. Xilinx
also makes FPGAs that are very affordable, as do Altera and Lattice.

Both of the ones I gave you are $3. The DSP costs $3.02 and the MSP430
is $3.09. These are over-the-counter no-haggle prices. Can a $3 iCE40
device emulate a TMS320 or a big MSP430? I can't tell because I don't
know this Lattice series and I am not an FPGA expert. But it sure looks
like they'd have a hard time.
I have already explained that I would never do a design in an FPGA to
wholly incorporate a DSP or MCU. That would be absurd. So why do you
keep asking about that?

Because you wrote yesterday, quote "For $3 however, you can get a chip
large enough for a CPU (with math) and room for your special logic".

Depending on your design requirements there are any number of FPGAs that
will do the job and some may be $3. What are your design requirements?

As I said, I do not have any hammered out ones yet but it'll come. This
was just about your $3 claim. So I gave some examples of devices that
cost $3.
I understand the concept of work that can't be moved. You don't need to
continue to explain that. I was asking why you said most of your work
didn't have that requirement and yet you still were debating the point.
Now I get it, you are talking about two different things, work that can
be moved and work that can't be moved.

Yup. Hence the need for availability of local programmer talent. Less
local availability means potential problems. That is because (where
possible) I like to use architectures I am familiar with.

Programmer talent means long-term availability. For example, if a client has an issue
with an 8051 design long after the original programmer has moved on I
could find programmers within a 10-mile radius. Try that with an FPGA.
In San Jose it may be possible but not out here.

[...]

Which, AKM or the other? I'd like to think I can get a CD quality CODEC
for $3 from nearly anyone. I mainly picked AKM because of the size, 6x6
mm without going to a microBGA.

AKM has good prices.
High 10's or low 10's. Up to, say, 20 or 30 ksps is easy to do in an
FPGA with decent resolution, 12 bits. Getting 16 bits is harder, I've
not tried it, but should be possible.

Mostly I need 40-50 ksps. But 20 is often ok.

I was looking at using an LVDS input for a comparator and Xilinx did it
in a demo of a SDR. They are very short on details, but they talk about
1 mV on the input. I know that's not anything special, I'm hoping to do
better, much better.

If you can keep substrate noise in check it could work. Try to remain
fully differential as much as you can. Not sure if FPGA design suites
still let you hand-place blocks so you can avoid making a big racket
right next to the ADC area.
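That LVDS-comparator trick is essentially a first-order sigma-delta front end: the differential input pair acts as the comparator, an external RC provides the integrator, and a pin driven from a flip-flop closes the 1-bit feedback loop. A minimal behavioral sketch of the idea (an illustrative Python model, not from the thread; `osr` and the function name are made up for the example):

```python
# Behavioral model of a first-order sigma-delta ADC as built from an
# FPGA's LVDS input (comparator) plus an RC integrator and a 1-bit
# feedback DAC driven from a flip-flop. Illustrative only.

def sigma_delta(samples, osr=64):
    """Modulate each input value (range -1..+1) into `osr` one-bit
    samples, then decimate with a simple boxcar average."""
    out = []
    integ = 0.0   # the RC integrator state
    fb = 0.0      # 1-bit DAC feedback (+1/-1 once running)
    for x in samples:
        acc = 0
        for _ in range(osr):
            integ += x - fb                 # integrate input minus feedback
            bit = 1 if integ >= 0 else 0    # comparator = LVDS input pair
            fb = 1.0 if bit else -1.0       # feedback pin, registered
            acc += bit
        out.append(2.0 * acc / osr - 1.0)   # boxcar decimation to a value
    return out

vals = sigma_delta([0.3, -0.5, 0.0], osr=256)
```

The resolution comes from the oversampling ratio (and the decimation filter), which is why 12 bits at a few tens of ksps is plausible while 16 bits gets hard: the oversampling clock and the analog front end both have to keep up.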
Yes, that is my claim. The obvious exception is when some feature is
needed that just isn't available in an FPGA. I'm not saying *every*
project can be done better in an FPGA. I'm saying that designers tend
to just not consider FPGAs when they are often viable solutions.

In most of my apps I need much of the functionality that a decent uC
affords, like the $3 device from the MSP430 series I mentioned.
Ok, that is likely a no-go. If you really want to emulate a DSP chip
then an FPGA is not likely to be a useful way to proceed. Wanting to
run DSP precompiled library code is a bit of an extreme requirement. If
the customer wants a DSP, then by all means give them a DSP. But don't
automatically exclude an FPGA from the task.

Sometimes it would also be ok if there were similar pre-cooked FPGA
routines (I/Q signal processing, non-linear filters et cetera).
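As an illustration of the kind of pre-cooked I/Q routine meant here, this is a minimal quadrature demodulation sketch: mix a real signal against a cosine/sine pair and average to baseband. Python is used for readability; an FPGA version would typically use an NCO or CORDIC plus hardware multipliers. The function name and parameters are made up for the example, and the record length is assumed to span an integer number of carrier cycles:

```python
import math

def iq_demod(samples, fs, f_c):
    """Mix a real signal to baseband with a cos/-sin pair and average
    over the whole record (assumed an integer number of carrier cycles)."""
    n = len(samples)
    i_acc = q_acc = 0.0
    for k, x in enumerate(samples):
        ph = 2 * math.pi * f_c * k / fs
        i_acc += x * math.cos(ph)       # in-phase mixer
        q_acc += x * -math.sin(ph)      # quadrature mixer
    # factor 2 restores the original amplitude lost in mixing
    return 2 * i_acc / n, 2 * q_acc / n

# A 1 kHz tone at 48 ksps with 0.7 rad phase offset:
fs, f_c, phi = 48000.0, 1000.0, 0.7
sig = [math.cos(2 * math.pi * f_c * k / fs + phi) for k in range(480)]
i, q = iq_demod(sig, fs, f_c)
amp = math.hypot(i, q)      # recovered amplitude
phase = math.atan2(q, i)    # recovered phase
```

In a real design the boxcar average would be a proper low-pass/decimation filter, but the structure (two multipliers, two accumulators) is the same.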
Yeah, well that is a deal killer for *any* other alternative. That is
not related to what I was saying. My point is that if you don't have
any specific requirement that dictates the use of a given chip, an FPGA
has as good a chance at meeting the requirements as an MCU or DSP. In
fact, FPGAs are what get used when DSPs aren't fast enough. My point is
you don't have to limit them to the high end. They also do very well at
the low end.

No disagreement there, programmables have come a long way since the days
of GALs. Which I never used because they were expensive power guzzlers.

One other challenge that needs to be met in most of my cases is
longevity of the design. An FPGA would have to remain available for more
than just a few years. For example, one of my uC-based designs from the
mid 90's is still in production. Since I kind of had a hunch that this
would happen I used an 8051 family uC. Is there something similar in the
world of FPGA?
 

Tim Williams

Joerg said:
One other challenge that needs to be met in most of my cases is
longevity of the design. An FPGA would have to remain available for more
than just a few years. For example, one of my uC-based designs from the
mid 90's is still in production. Since I kind of had a hunch that this
would happen I used an 8051 family uC. Is there something similar in the
world of FPGA?

Interjecting what little experience I have here;

MAX7000s are still available, and they were introduced in 1995;
http://www.altera.com/devices/cpld/max-about/max-about.html
although I think they've finally gone out of stock (not that you'd want to
use them, they also guzzle power like they're NMOS).

FLEX 10K FPGAs don't appear on their website, but they're still quite
available. Can't seem to find when they were introduced; I'm seeing
documents since at least 1996 referencing them.

At least between the big two (Altera that I know of, and I would assume
Xilinx too), FPGAs look to have excellent support over time.

And VHDL doesn't age. The library blocks might, but I'd be willing to
guess that those are synthesized as well, so as long as your toolchain
supports whatever code format, you can migrate chips easily when they do
finally die.

And really, if you have to support something for over 20 years, it's
probably time it does die and gets a redesign. Like that VAX or whatever
it was NASA's still supporting.

Tim
 

Joerg

rickman said:
No, you would need a DSP specialist. The FPGA designer only needs to
know how to code the FPGA.

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.

But that is exactly the point of the FPGA in DSP apps. You code to the
app, not to a processor.

How long do the usual FPGAs stay in the market? Meaning plop-in
replaceable, same footprint, same code, no changes.
 

Joerg

Tim said:
Interjecting what little experience I have here;

MAX7000s are still available, and they were introduced in 1995;
http://www.altera.com/devices/cpld/max-about/max-about.html
although I think they've finally gone out of stock (not that you'd want to
use them, they also guzzle power like they're NMOS).

FLEX 10K FPGAs don't appear on their website, but they're still quite
available. Can't seem to find when they were introduced; I'm seeing
documents since at least 1996 referencing them.

Ok, sorry, forgot to say that I never use that manufacturer.

At least between the big two (Altera that I know of, and I would assume
Xilinx too), FPGAs look to have excellent support over time.

And VHDL doesn't age. The library blocks might, but I'd be willing to
guess that those are synthesized as well, so as long as your toolchain
supports whatever code format, you can migrate chips easily when they do
finally die.

But the minute a footprint changes or you have to re-compile you are
screwed in some heavily regulated markets.

And really, if you have to support something for over 20 years, it's
probably time it does die and gets a redesign. ...


Nope. Not if it's in the worlds of medical or aerospace. There you have
a huge re-cert effort on your hands for changes. New layout? Back to the
end of the line.

You also need to support very old legacy equipment with type-certified
spare parts and for those few sales you really don't want a re-cert.
Just think about the DC-3 which is still in service. Some of those are
over 60 years old. Can be worse with elevators. We have a company near
here that sometimes has to support elevators that are over a century old.

... Like that VAX or whatever
it was NASA's still supporting.

Sometimes changing is very time consuming. I recently learned that this
is even the case for alarm systems. "If we even add as much as one
capacitor for EMC we have to go through the whole insurer certification
process again".
 

Stef

In comp.arch.embedded,
Joerg said:
Nope. Not if it's in the worlds of medical or aerospace. There you have
a huge re-cert effort on your hands for changes. New layout? Back to the
end of the line.

That is not always true (at least for medical equipment, no experience
with aerospace). If the change is minor enough, it may be enough to
write a rationale that explains the change and how it does not impact
the function of the equipment. If the notified body agrees with the
rationale, only a limited effort is required to re-cert.
Sometimes changing is very time consuming. I recently learned that this
is even the case for alarm systems. "If we even add as much as one
capacitor for EMC we have to go through the whole insurer certification
process again".

Weird, I would expect a similar approach with a rationale or something
would be enough.
 

rickman

So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.

You can do it anyway you want. I'm just making the distinction between
DSP knowledge and FPGA knowledge. They aren't very much the same. I
also make a distinction between a DSP designer and a DSP coder. Again,
not much in common. Coding doesn't really require a lot of DSP
knowledge and DSP designers often aren't experts at coding the finicky
chips.

How long do the usual FPGA stay in the market? Meaning plop-in
replaceable, same footprint, same code, no changes.

Life span is typically *much* better than MCUs.

First, there are *no* second sources so whatever chip family you select
is the only one that will fit your layout. There *may* be more than one
member of that family that will fit the same socket, that is common, but
not guaranteed. So you often will get a choice of two, three or four
sizes and you often get an upgrade path from your first selection. Just
in case you are familiar with the compilation process, in an FPGA you
*always* have to recompile for the target. Even if they are pin
compatible you can't load a design for a whatever-02 chip into a
whatever-03 part. Those are the limitations.

As to the market life, that is typically well over 10 years. Spartan 3
was introduced some 10 years ago and it is not yet the oldest chip
Xilinx has in current full production. I'm still considering using it
for new designs. Similar situation for Altera. I was just burned by
Lattice announcing EOL of their XP line. This was because they got a
new guy in at the top with a new broom, I suppose.

I'm sure you can find various MCUs which have been in production for 10
years, but I know Atmel likes to replace products from time to time with
similar "pin compatible" devices which are 99.9% compatible. I expect
for the 8 bit parts life span is not such an issue. For the larger
parts I expect life span is a bit more limited and for the top end
chips, I'm pretty sure their life span is measured in double digit
months. Can you still buy any of the Pentium 4s that were all over the
place seven or eight years ago? I can't even find a Core 2 Duo.

What lifespan have you seen for MCUs?
 

rickman

Yes, we've had many discussions about that part ;-).


The engineering analysis is implied: it takes far more silicon to
implement a microprocessor in LUTs than directly in silicon, plus you
lose a lot of speed because of all the additional layers and lookups.

That is a pointless comparison. I have never once opened up a chip to
see how much silicon it used. I compare the things I can see from the
outside, cost, power consumption, etc... You can infer anything you
wish. The proof of the pudding is in the eating.

This is exactly the type of bias I'd like to overcome.

Yes, the ASIC bypassed the relative inefficiency of doing the same thing
in FPGA's. It would be cool to have some tiny processors like that
available as hard cells on small FPGA's.

Ok, but your "efficiency" rating is not of any real value in a design.
Stop limiting yourself by pointless metrics. If you like the idea of a
lot of processors on a chip, then design one on an FPGA and see how it
works.

Do you see what I'm trying to say?
 