Maker Pro

AREF bypass capacitance on ATMega2560?

P

Paul Rubin

rickman said:
That is a pointless comparison. I have never once opened up a chip to
see how much silicon it used. I compare the things I can see from the
outside, cost, power consumption, etc...

Yes, and those are quite closely dependent on the amount of silicon used.
You can infer anything you wish. The proof of the pudding is in the
eating.

OK. That GA144 you mentioned has 144 CPU nodes made in a rather old
process technology (0.18 micron, I guess 1990's vintage). They still
manage to run the thing at 700+ MHz, keep power consumption to around
0.5W with all CPUs running full speed, and sell it for $20 in small
quantity. Can you do anything like that with an FPGA? What will it
cost? How much power will it use? I'll accept the b16 as a comparable
processor to a GA144 node. Bernd's paper mentions the b16 ran at 25 MHz
in a Flex10K30E, a 30-to-1 slowdown, power consumption not mentioned.
But I don't know how the underlying silicon processes compare.
 
R

rickman

I didn't. I said it is a DSP with large memory, which it is.

You first give a part, the C5535, as the chip with big memory, then it
becomes the C5532 which is less memory and less expensive. I can't tell
what you are talking about when the subject changes.

The fact that most FPGAs don't have flash is fine ... but ... there must
be a decent bootloader inside. In one of the upcoming projects it must
be able to bootload via USB. So the device must wake up with a certain
minimum in brain functionality to handle the USB stuff. With FPGAs that
can become a challenge unless you provide a serial memory device (which
adds cost).

No, you won't find any FPGAs which can wake up talking over USB. But
you will find FPGAs with internal Flash if you wish to design a USB
bootloader.

Both of the ones I gave you are $3. The DSP costs $3.02 and the MSP430
is $3.09. These are over-the-counter no-haggle prices. Can a $3 iCE40
device emulate a TMS320 or a big MSP430? I can't tell because I don't
know this Lattice series and I am not an FPGA expert. But it sure looks
like they'd have a hard time.

No, the C5535 part is not $3. That is what I mean by two part numbers.

Because you wrote yesterday, quote "For $3 however, you can get a chip
large enough for a CPU (with math) and room for your special logic".

I said "a CPU" not "any CPU". I never said it would duplicate a
commercial device. I'm talking about function.

As I said, I do not have any hammered out ones yet but it'll come. This
was just about your $3 claim. So I gave some examples of devices that
cost $3.

Yes, and there are FPGAs in that price range which can be used to
implement a CPU plus other logic.

Yup. Hence the need for availability of local programmer talent. Less
local availability means potential problems. That is because (where
possible) I like to use architectures I am familiar with.

Programmer talent means long-term. For example, if a client has an issue
with an 8051 design long after the original programmer has moved on I
could find programmers within a 10-mile radius. Try that with an FPGA.
In San Jose it may be possible but not out here.

I can't speak of your environment. I know my friend of many years
stayed away from FPGAs in spite of the fact that he is a very capable
designer. He finally paid me for a week of FPGA design work which I
then turned over to him and helped him get started with HDL. It's not
hard at all. You don't really need anyone special. That is the sort of
thinking I am trying to dispel.

Another example. A software designer came to a newsgroup looking for
info on programming FPGAs. He used the mindset of a software guy and
wanted to do a "hello world" program. We tried to explain to him that
hardware isn't software and HDL isn't C. But he persisted and I gave
him advice over a week or so. I tried to turn it into a consulting gig
but his bosses didn't want to pay the bucks. He ended up doing just
fine with his software mindset and convinced his boss to pay me $500
over my protests. I cashed the check when it came.
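For what it's worth, the hardware equivalent of "hello world" is usually an LED blinker: a free-running counter whose top bit drives the LED. A rough Python model of that logic (the 24-bit width and 12 MHz clock are illustrative assumptions, not anything from the thread):

```python
# Hardware's "hello world" is an LED blinker: a free-running counter
# whose most significant bit drives the LED. This models that logic.
# The 24-bit width and 12 MHz clock are illustrative assumptions.

WIDTH = 24               # counter width (assumed)
CLOCK_HZ = 12_000_000    # assumed oscillator frequency

def blink_period_s(width=WIDTH, clock_hz=CLOCK_HZ):
    """Seconds per full LED on/off cycle: the counter wraps every 2**width clocks."""
    return (2 ** width) / clock_hz

def led_state(cycle, width=WIDTH):
    """LED follows the counter's most significant bit."""
    return (cycle >> (width - 1)) & 1

# LED is off for the first half of the count range, on for the second half.
assert led_state(0) == 0
assert led_state(2 ** (WIDTH - 1)) == 1
print(f"blink cycle: {blink_period_s():.2f} s")  # about 1.40 s
```

In an HDL the whole thing is a register and an increment; the point of the exercise is learning the toolchain, not the logic.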

The point is that FPGAs are not so hard that you need a unique talent to
design them. That may have been true 10+ years ago, but they are very
mainstream now and much easier to work with. I bet even *you* could do
an FPGA design, lol.

I don't care where you are located, if you can't find an FPGA designer,
you aren't looking very hard.

AKM has good prices.

Ok. I have no complaints on prices. Their lead time can be a problem.
I had a three-way conversation, the disti, the manufacturer's guy and
me. I was complaining about a 14-week lead time and he bragged that a
14-week lead time was *good*. I give my customers a 10-week lead
time... see the problem? Digikey sells them now so it is not such an
issue. I even ended up speaking with a buyer or planner who was
coordinating the shipment of an order last spring. Once you reach them
they are very nice.

Mostly I need 40-50ksps. But 20 is often ok.

I haven't done 12 bits at 50 ksps, but I expect it is doable. Just
cross the t's and dot the i's.

If you can keep substrate noise in check it could work. Try to remain
fully differential as much as you can. Not sure if FPGA design suites
still let you hand-place blocks so you can avoid making a big racket
right next to the ADC area.

*Everything* in an FPGA makes noise, it's all digital. Yes you can hand
place logic if you want. That is the sort of thing best done at the end
if possible when you are ready to finalize the chip. But what would you
have in an FPGA design that makes more noise than anything else? Each
logic block is very small and has a pretty low power consumption. It
would be the I/O that has significant power spikes and you have total
control over that.
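Those I/O power spikes can be put in rough numbers with the classic simultaneous-switching estimate V = n * L * di/dt. A quick sketch (the output count, package inductance and edge rate below are illustrative assumptions, not values from any particular FPGA):

```python
# Rough simultaneous-switching (ground bounce) estimate: V = n * L * di/dt.
# All component values below are illustrative assumptions.

def ground_bounce_v(n_outputs, l_henry, di_dt):
    """Bounce voltage when n outputs switch together through shared inductance L."""
    return n_outputs * l_henry * di_dt

# e.g. 8 outputs, 5 nH shared lead inductance, 20 mA swinging in 2 ns each:
v = ground_bounce_v(8, 5e-9, 0.020 / 2e-9)
print(f"{v:.2f} V")  # 0.40 V
```

This is why controlling slew rate and staggering output switching (which most FPGA I/O blocks let you configure) matters more for an on-chip ADC than the logic fabric does.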

In most of my apps I need much of the functionality that a decent uC
affords, like the $3 device from the MSP430 series I mentioned.

If you need 256 kB of memory then you won't reach a $3 price tag. If
you need something more like the low end processor you mentioned that
might be doable in the low end FPGAs. They have block RAM, but it
scales with the size of the chip. When you have a specific requirement
we can look and see what matches.
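As a ballpark for how block RAM scales, here is the arithmetic for fitting a given memory size into small block RAMs, assuming 4 kbit blocks (the size of the iCE40 EBRs; other families differ):

```python
import math

def block_rams_needed(bytes_required, block_bits=4096):
    """Block RAMs needed for a byte count, ignoring depth/width packing waste."""
    return math.ceil(bytes_required * 8 / block_bits)

print(block_rams_needed(256 * 1024))  # 512 blocks -- far beyond a low-end part
print(block_rams_needed(4 * 1024))    # 8 blocks -- easily doable
```

512 blocks of 4 kbit is why 256 kB pushes you out of the $3 class, while a few kB for a small soft core is no problem.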

Sometimes it would also be ok if there were similar pre-cooked FPGA
routines (I/Q signal processing, non-linear filters et cetera).

There are design tools that will generate function blocks, filters, etc.
I have not had to deal with them. The DSP stuff I have done I just
coded up in HDL.

No disagreement there, programmables have come a long way since the days
of GALs. Which I never used because they were expensive power guzzlers.

One other challenge that needs to be met in most of my cases is
longevity of the design. A FPGA would have to remain available for more
than just a few years. For example, one of my uC-based designs from the
mid 90's is still in production. Since I kind of had a hunch that this
would happen I used an 8051 family uC. Is there something similar in the
world of FPGA?

That is typically not a problem, but pick a device that is relatively
new to start with. The vendors are *all* about their latest and
greatest products. I guess they need a critical mass of design wins up
front which they get revenue from over the life of the part. So they
push the newest stuff and let you ask about the older parts.

I don't think there is anything like the 8051 other than the 22V10
perhaps. The 8051 is an anomaly in the MCU world. You won't see a DSP
equivalent for example. So far users typically want more, more, more
from FPGAs. So a stationary design would not have a market. Even
though there are ever larger markets for low end parts, they keep
redesigning them to make them cheaper. When they do that they add
incompatibility because it doesn't affect the bulk of the users,
recompile and you are good to go. But pin compatibility, no, that just
doesn't exist other than within a single family. Fortunately product
life is typically not an issue.
 
J

Joerg

rickman said:
You can do it any way you want. I'm just making the distinction between
DSP knowledge and FPGA knowledge. They aren't very much the same. I
also make a distinction between a DSP designer and a DSP coder. Again,
not much in common. Coding doesn't really require a lot of DSP
knowledge and DSP designers often aren't experts at coding the finicky
chips.

I have learned that with uC as well. There are lots of programmers but
not too many who can lay down a realtime program architecture. So I do
that a lot. And I am a guy who cannot really program uCs very easily, I
don't speak much C.

With DSP it's the same and I would expect that also from FPGA. I propose
an architecture, what needs to be calculated, and when. Then a
programmer takes over. But I can't justify more than one person for
that. So if some or a lot of uC or DSP cores have to be poured into an
FPGA then I guess the FPGA guy has to take over both.

Life span is typically *much* better than MCUs.

First, there are *no* second sources so whatever chip family you select
is the only one that will fit your layout. ...


That is one of my concerns. With 8051 uCs you have multiple sources as
long as you stick to customary packages such as a 44-pin flat-pack.

... There *may* be more than one
member of that family that will fit the same socket, that is common, but
not guaranteed. So you often will get a choice of two, three or four
sizes and you often get an upgrade path from your first selection. Just
in case you are familiar with the compilation process, in an FPGA you
*always* have to recompile for the target. Even if they are pin
compatible you can't load a design for a whatever-02 chip into a
whatever-03 part. Those are the limitations.

Yeah, that I was aware of. And changing to a whatever-03 would be a
major headache in many of my cases. Because it's medical, aerospace or
similar, where that can trigger a complete re-cert.

As to the market life, that is typically well over 10 years. Spartan 3
was introduced some 10 years ago and it is not yet the oldest chip
Xilinx has in current full production. I'm still considering using it
for new designs. Similar situation for Altera. ...


Well over 10 years is good. But only if that means no change to any new
versions that require a re-compile. Early on in my career that happened
and one guy promptly got busy with three months of regression testing.
Oh what fun.

... I was just burned by
Lattice announcing EOL of their XP line. This was because they got a
new guy in at the top with a new broom I suppose.

Not so cool :-(

I'm sure you can find various MCUs which have been in production for 10
years, but I know Atmel likes to replace products from time to time with
similar "pin compatible" devices which are 99.9% compatible. I expect
for the 8 bit parts life span is not such an issue. For the larger
parts I expect life span is a bit more limited and for the top end
chips, I'm pretty sure their life span is measured in double digit
months. Can you still buy any of the Pentium 4s that were all over the
place seven or eight years ago?

Yup:

http://components.arrow.com/part/detail/41596500S6440784N2936?region=na


... I can't even find a Core 2 Duo.


No problem either:

http://components.arrow.com/part/detail/42952225S9497728N2936?region=na

What lifespan have you seen for MCUs?

The 89C51 I designed in in the mid-90's is still living. Not sure how
long it was in production when I designed it in. The nice thing is that
these are made by several companies, even Asian ones such as Winbond. So
it was no surprise when I took apart our pellet stove for maintenance
and found one of those in there as well.

2nd source is important to me, and my clients.
 
J

Joerg

Stef said:
In comp.arch.embedded,


That is not always true (at least for medical equipment, no experience
with aerospace). If the change is minor enough, it may be enough to
write a rationale that explains the change and how it does not impact
the function of the equipment. If the notified body agrees with the
rationale, only a limited effort is required to re-cert.

I really doubt they would agree if a code re-compilation was required to
make this work. With code and firmware they have become very careful
because there have been too many mishaps.

Most of the time the notified bodies or even the FDA do not care much
about the code, they care about your process. So then the onus is on the
company, and there mostly on the VP of Quality Control. He or she will
normally not take a re-compile lightly, or as something that can be
brushed under the carpet as "not too risky".

It is the same with some hardware. I went through a whole re-cert once
just because we had to switch the manufacturer for one little transformer.

The bottomline is that in the unlikely but possible situation where
something bad happens you need to be prepared. Then there will be a
barrage of requests for documents from the regression testing and all
that. Woe to those who then don't have them.

Weird, I would expect a similar approach with a rationale or something
would be enough.

There are many other markets with similar requirements. One of them is
railroad electronics, especially for countries like Germany.
 
P

Paul Rubin

rickman said:
The point is that FPGAs are not so hard that you need a unique talent
to design them. That may have been true 10+ years ago, but they are
very mainstream now and much easier to work with.

There still appears to be a complete absence of FOSS toolchains, at
least for any current interesting parts.

If you need 256 kB of memory then you won't reach a $3 price tag.

That part (MSP430F6733) has 64k of flash and 4k of ram, not out of
reach. It does have some nice other features that may be hard to
duplicate with an fpga, like quite low power consumption:
http://www.ti.com/product/msp430f6733
 
R

rickman

That is one of my concerns. With 8051 uCs you have multiple sources as
long as you stick to customary packages such as a 44-pin flat-pack.

Yes, but 8051s aren't DSPs either, are they? You seem to be switching
gears again. I can't keep up. I know you do different designs, but can
the FPGA be wrong for *all* of them? You seem to apply every requirement
to every design.

Yeah, that I was aware of. And changing to a whatever-03 would be a
major headache in many of my cases. Because it's medical, aerospace or
similar, where that can trigger a complete re-cert.

Why would you need to change to the whatever-03? Once it is qualified
you can stick with it. My point is that you have flexibility in the
device, no one is making you switch.

Well over 10 years is good. But only if that means no change to any new
versions that require a re-compile. Early on in my career that happened
and one guy promptly got busy with three months of regression testing.
Oh what fun.

Why not talk to the vendors?

Not so cool :-(

Yeah, I'm unhappy about it. I thought I could get more development
funds for a redo but the division reselling this board in their product
doesn't want to spend any cash on it. I've been asked to spend my dime
and I likely will. I make good money on this product.


How do you know these are the parts that were designed in the system of
interest? They made a huge number of variants and I know I have seen
EOL notices for Pentium 4s.

The 89C51 I designed in in the mid-90's is still living. Not sure how
long it was in production when I designed it in. The nice thing is that
these are made by several companies, even Asian ones such as Winbond. So
it was no surprise when I took apart our pellet stove for maintenance
and found one of those in there as well.

2nd source is important to me, and my clients.

If you really need that level of consistency, then you will be using
nothing but 8051s all your career. I don't know of any digital
component that has lived as long as the 8051 other than perhaps LS-TTL.
I also don't know of any other MCU that is second sourced. If the
8051 does what you need, then go for it. But again you are mixing
conversations. That's why it is so frustrating to have a conversation
with you. You talk about not being able to use a part unless it has a
product life as long as the 8051 and then you talk about using various
DSP chips in the same context. I *know* you won't be able to buy those
DSP chips 10 years from now. TI just doesn't provide that level of
support unless they have a special program for long lived parts I'm not
aware of. I've seen lightly selling DSPs drop from the marketplace
after less than 5 years.

The DSP market was just a tiny exploration by TI initially. Then they
saw cell phones as a way to utilize that capability. They actually
reorganized the entire company to take full advantage of it. As a
result they ended up with four segments for DSPs.

1) Cell phone devices - small, low power and cheap in large quantities.
Not much need for longevity at all... basically the C5xxx line.

2) Cell base stations - powerful devices that can handle multiple
channels, power consumption not important and cost is secondary. This
is the C6xxx line. Again, they focus on new, not longevity.

3) Scientific DSP - floating point. C67xx lines. Relatively low
volumes compared to the other two, but they seem to think it is an
important market. New designs are not as frequent. Longevity might be
better than the other two, but no promises.

4) Motor control, white goods, etc - fixed point with price the major
factor. These have appeared in a range of variations, some with flash,
some with ADCs, etc. These are almost MCUs with similar performance,
slow compared to segment 1 and 2. Intended for high volume apps, but
again, longevity is not important.

So if you are going to consider DSPs for your apps, I expect you would
be looking at the last category. I'm pretty sure I wouldn't be
designing from this group if I wanted to be building this board 10 years
from now though. Have you talked to TI about longevity?
 
R

rickman

There still appears to be a complete absence of FOSS toolchains, at
least for any current interesting parts.

There are no FOSS bit stream generators and there never will be. If
that is a no-go for you, then you will never use FPGAs from any of the
existing companies.

That part (MSP430F6733) has 64k of flash and 4k of ram, not out of
reach. It does have some nice other features that may be hard to
duplicate with an fpga, like quite low power consumption:
http://www.ti.com/product/msp430f6733

You have been reading old books. Not all FPGAs are power hungry. Check
the Lattice site for the iCE40 line. Very low power.
 
R

rickman

Yes, and those are quite closely dependent on the amount of silicon used.

Nonsense. If you want to know how much water has collected in the
basement because of a burst pipe, do you call the water authority to
read the meter? No, you put a stick in the water and measure it. If
you want to know a parameter, then measure that parameter, don't infer
it from something only vaguely related.

You have a bias against soft cores because you want to analyze them in a
meaningless way. How about analyzing them in the terms that you care
about?

OK. That GA144 you mentioned has 144 CPU nodes made in a rather old
process technology (0.18 micron, I guess 1990's vintage). They still
manage to run the thing at 700+ MHz, keep power consumption to around
0.5W with all CPUs running full speed, and sell it for $20 in small
quantity. Can you do anything like that with an FPGA?

Like what exactly? Do 700 MIPS, of course you can. An FPGA can be
configured to run your algorithm more exactly than any processor and so
it can get very low power.

BTW, you know the GA144 doesn't do 700 MIPS either. It is less than
half that with most code. The GA144 isn't 0.5 Watts either, it is close
to 1 Watt with all nodes running. It also doesn't cost $20 to use
because it requires a *ton* of support devices, boot prom, RAM, clock,
1.8 volt to *everything else* voltage translation, etc...

I actually considered using it in my board redesign. I might have to
add a RAM chip to it, but all the clocks are external to the board
anyway and there is already a low voltage power supply. So the main
issue is the voltage translation which is partly dealt with currently
since the current FPGA had to be buffered to some of the I/O for 5 volt
logic. So the GA144 might do ok in that design. But then there is the
reason I am doing a redesign... the FPGA is EOL. I don't have much
confidence GA will be around in 10 years. Do you know of one major
design win they have had?

What will it
cost?

You haven't told me what the design requirements are... how can I
possibly give you a price?

How much power will it use?

How long is a piece of string?

I'll accept the b16 as a comparable
processor to a GA144 node. Bernd's paper mentions the b16 ran at 25 MHz
in a Flex10K30E, a 30-to-1 slowdown, power consumption not mentioned.
But I don't know how the underlying silicon processes compare.

You are trying to compare apples to horses. No, you can't use an FPGA
to implement some existing processor and improve on cost, power or any
other parameter. I never said you could. That would be like using a
kitchen knife as a razor. It won't work so well and has little value.
But if you have an application - it may well be easier to implement in
an FPGA than in a GA144... in fact, I can almost guarantee that!
 
J

John Devereux

rickman said:
So your use of MCUs is based on inertia?

Partly I suppose.

Or I could say that my projects so far all require a microcontroller
anyway, and it seemed likely that a separate FPGA was always going to be
more expensive than, say, choosing a faster CPU.

An STM32F4 can bitbang a PIO at 84 MHz. (It can't do anything else then,
but still...)

A startup company called Silicon Blue came out with a line of FPGAs
targeted to the high volume, low power market that exists for portable
devices. They were preparing their second device family and were
bought by Lattice Semi. The first family was dropped and the second
family is the iCE40 (for 40 nm). They are very low power although
smallish. The largest one has 8 kLUTs, the smallest 384 LUTs.

Last winter I was looking at designing a very low power radio
controlled clock to run in one of these. They were still playing a
shell game with the devices in the lineup and the 640 LUT part I
wanted to use was dropped... :( The only real problem I have with
these devices is the packaging. Because of the target market the
packages are mostly fine pitch BGAs. Great if you are making a cell
phone, not so great if you are designing other equipment.

You can get the 1 kLUT parts for under $3 and possibly the 4 kLUT
parts. It has been a while since I got a quote. The 1 kLUT part is
big enough for a soft core MCU plus some custom logic.

OK, thanks, will check them out.
 
R

rickman

Yes, basically. "a lot" being only e.g. about 64k probably, not much for
a MCU but would push the price up for an FPGA I think.


I'm pretty sure that a FPGA with enough RAM would be far too expensive
(compared to the $3 200 MIPS CPU).

I won't pretend that an FPGA is the right solution for every task. But
I think MCUs are often used because that is what the designer is used to
and FPGAs aren't understood well enough to consider. Is "enough" RAM
more than what a given FPGA has? I don't know, how much RAM do you
really need? Most MCU projects I have worked on never had a realistic
RAM estimate, it was all by the seat of the pants. The fact that code
uses RAM makes it harder to estimate. FPGAs are a lot easier to design
with in that regard. RAM quantities have to be known exactly. LUT
counts have to be estimated though, so it's not totally different.

A M3 or M4 with attached FPGA + memories would be interesting, if it was
at a reasonable price.

Or even an AVR... are you reading Ulf? I think the requirements for
MCUs are often overstated. Most of the sort of work I do could be done
with an 8051 (ugh!), especially one of the higher performance devices,
but I often don't have the real estate for a separate MCU unless I can
treat it as an I/O expander.

NXP have a M4 with attached M0 which sort of goes in that direction; the
M0 does the more deterministic simple stuff, the M4 does the number
crunching and runs the more complicated software.

Hell, I'd be ecstatic if they provided FPGAs in small enough packages so
I can use a 32 pin QFN for an MCU and the same footprint for an FPGA.
Well, Lattice *does* put an XO2 in a 32 QFN, but only 256 LUTs which is
not big enough for much. Why not 1 or 2 or 4 kLUT? For some reason
FPGA vendors all think you need more I/O and fewer LUTs.

But could you give an example of your $3 one? Or a favorite?
[...]
You can get the 1 kLUT parts for under $3 and possibly the 4 kLUT
parts. It has been a while since I got a quote. The 1 kLUT part is
big enough for a soft core MCU plus some custom logic.

OK, thanks, will check them out.

I haven't gotten a quote on these parts since they were bought by
Lattice. I'd appreciate a pricing update if you get one. They should
be able to do a lot better than the Digikey price, I know Xilinx and
Altera always do. Heck, the Digikey pricing for most FPGAs doesn't go
above qty 1... if nothing else there should be some quantity price
breaks.

Unfortunately I don't really have a live application, so would only be
able to buy them as "education" at this stage.

I got a freebie eval board for the iCE40 but haven't fired it up. I
want to measure some power consumption numbers. The data sheets changed
the static current a while back, well after they had been out, just
after Lattice bought SiliconBlue so I'm not sure what that was about.
The 1 kLUT part went from around 40 uA to 100 uA quiescent current. The
dynamic current is still very low though, single digit mA with the
device full of 16 bit counters running at 32 MHz. But they seem to have
removed that data when they changed data sheet formats.
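Those single-digit-mA numbers are consistent with the first-order CMOS estimate I ≈ alpha * C * V * f. A back-of-the-envelope sketch (the switched capacitance, core voltage and activity factor are guesses for illustration, not iCE40 datasheet figures):

```python
# First-order CMOS dynamic current: I = alpha * C * V * f
# (from P = alpha * C * V^2 * f, divided by V).
# Capacitance, core voltage and activity factor are guesses, not datasheet values.

def dynamic_current_ma(c_switched, vdd, f_hz, alpha=0.5):
    """Average supply current (mA) from toggling capacitance C at frequency f."""
    return alpha * c_switched * vdd * f_hz * 1e3

# e.g. 200 pF total switched capacitance, 1.2 V core, 32 MHz clock:
print(f"{dynamic_current_ma(200e-12, 1.2, 32e6):.2f} mA")  # 3.84 mA
```

The useful takeaway is the linear scaling: halve the clock or the toggling logic and the dynamic current halves, which is why a mostly-idle low-power FPGA design can sit in the microamp range.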
 
J

Joerg

rickman said:
You first give a part, the C5535, as the chip with big memory, then it
becomes the C5532 which is less memory and less expensive. I can't tell
what you are talking about when the subject changes.

There are no subject changes. Did you even click on the link? The
datasheet is for the _whole_ series, _including_ the 5532. It clearly
says so in the first line on the first page.

[...]
No, you won't find any FPGAs which can wake up talking over USB. But
you will find FPGAs with internal Flash if you wish to design a USB
bootloader.

Then one of those would be required I guess. USB connectivity is
important these days.

No, the C5535 part is not $3. That is what I mean by two part numbers.

The C5532 is $3. That is the part in the Digikey link I gave. Datasheets
are often for a whole series, economy to deluxe. I thought that became
clear when you looked at the datasheet.

[...]

Yes, and there are FPGAs in that price range which can be used to
implement a CPU plus other logic.

Well, yeah, but we were talking about an appropriate and similarly
classed CPU, not a 30c 8-bitter from China.

I can't speak of your environment. I know my friend of many years
stayed away from FPGAs in spite of the fact that he is a very capable
designer. He finally paid me for a week of FPGA design work which I
then turned over to him and helped him get started with HDL. It's not
hard at all. You don't really need anyone special. That is the sort of
thinking I am trying to dispel.

For you it may be easy. I am somehow not the kind of guy that easily
learns programming languages. Human languages, yes. Really weird analog
or RF tricks, yes. C, C++ or HDL, not really. I can read through code to
some extent but it is like having to plow through a document in
Portuguese (which I had to do).

Another example. A software designer came to a newsgroup looking for
info on programming FPGAs. He used the mindset of a software guy and
wanted to do a "hello world" program. We tried to explain to him that
hardware isn't software and HDL isn't C. But he persisted and I gave
him advice over a week or so. I tried to turn it into a consulting gig
but his bosses didn't want to pay the bucks. He ended up doing just
fine with his software mindset and convinced his boss to pay me $500
over my protests. I cashed the check when it came.

The point is that FPGAs are not so hard that you need a unique talent to
design them. That may have been true 10+ years ago, but they are very
mainstream now and much easier to work with. I bet even *you* could do
an FPGA design, lol.

Maybe, but it'll take a while. I did some uC programming though so maybe
that helps.

I don't care where you are located, if you can't find an FPGA designer,
you aren't looking very hard.

In Cameron Park? Most if not all FPGA guys out here work for Intel, they
won't have time for consulting gigs and may not even be allowed to do it.

Ok. I have no complaints on prices. Their lead time can be a problem.
I had a three-way conversation, the disti, the manufacturer's guy and
me. I was complaining about a 14-week lead time and he bragged that a
14-week lead time was *good*. I give my customers a 10-week lead
time... see the problem? Digikey sells them now so it is not such an
issue. I even ended up speaking with a buyer or planner who was
coordinating the shipment of an order last spring. Once you reach them
they are very nice.

Yes, Digikey has them. My rule is that if Digikey doesn't have something
I try to avoid the part. Except for Coilcraft.

I haven't done 12 bits at 50 ksps, but I expect it is doable. Just
cross the t's and dot the i's.

It's not just getting it done in principle but also to yield at least
10.5 bits ENOB or so at that speed. Even with uC that can be a challenge.
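That 10.5-bit target translates into a SINAD requirement via the standard relation ENOB = (SINAD - 1.76) / 6.02 (SINAD in dB). A quick check:

```python
# Standard conversion between SINAD (dB) and effective number of bits:
#   ENOB = (SINAD - 1.76) / 6.02

def enob(sinad_db):
    """Effective number of bits for a measured SINAD in dB."""
    return (sinad_db - 1.76) / 6.02

def sinad_for_enob(bits):
    """SINAD (dB) required to reach a target effective bit count."""
    return bits * 6.02 + 1.76

# ~10.5 effective bits from a 12-bit converter needs roughly 65 dB SINAD:
print(f"{enob(65.0):.2f}")               # 10.50
print(f"{sinad_for_enob(10.5):.1f} dB")  # 65.0 dB
```

So the noise, distortion and reference ripple together have to stay about 65 dB below full scale, which is indeed hard next to switching logic.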
*Everything* in an FPGA makes noise, it's all digital. Yes you can hand
place logic if you want. That is the sort of thing best done at the end
if possible when you are ready to finalize the chip. But what would you
have in an FPGA design that makes more noise than anything else? Each
logic block is very small and has a pretty low power consumption. It
would be the I/O that has significant power spikes and you have total
control over that.

What sometimes causes issues are FLLs or PLLs in there to create the
master clock. But I only know that from uC. With FPGA we had EMI
problems and sometimes they required unorthodox measures. Had the same
thing with a discrete RAM bank: We had to run other parts in the FPGA as
dummy loads, ping-pong style, to reduce the noise energy. Their FPGA guy
almost threw me out of his cubicle when I suggested that but then it
worked. He bought me a coffee at the cantina :)

If you need 256 kB of memory then you won't reach a $3 price tag. If
you need something more like the low end processor you mentioned that
might be doable in the low end FPGAs. They have block RAM, but it
scales with the size of the chip. When you have a specific requirement
we can look and see what matches.

It'll be a while until I know for sure. Because whether or not I need
some massive compensator routine depends on the performance of a
complicated mechanical part that we won't have before spring next year.

[...]

That is typically not a problem, but pick a device that is relatively
new to start with. The vendors are *all* about their latest and
greatest products. I guess they need a critical mass of design wins up
front which they get revenue from over the life of the part. So they
push the newest stuff and let you ask about the older parts.

As long as they do not require a formal RFQ from the (not yet existing)
purchasing department of the company. Then I'd walk. No kidding, this
happened on a programmable device in the 90's.

I don't think there is anything like the 8051 other than the 22V10
perhaps. The 8051 is an anomaly in the MCU world. ...


Not an anomaly, it was bound to happen. There are many areas where 2nd
source is a must. The usual paranoia by manufacturers that this puts
downward pressure on the price was debunked by this very uC. It is the
only gripe I have with it, that it is expensive compared to more modern
ones. But you have no choice if there must be a 2nd source and they know it.

So it was also not very surprising that "Hayabusa editions" came out,
screaming along around 100MHz.

... You won't see a DSP
equivalent for example. So far users typically want more, more, more
from FPGAs. So a stationary design would not have a market. Even
though there are ever larger markets for low end parts, they keep
redesigning them to make them cheaper. When they do that they add
incompatibility because it doesn't affect the bulk of the users,
recompile and you are good to go. But pin compatibility, no, that just
doesn't exist other than within a single family. Fortunately product
life is typically not an issue.

It is for me. In the old days we preferred Analog Devices DSP (2110?
Forgot the part number) because they were the staple. We just used lots
of them per board. Cheap, too.

Come to think of it, in most of my designs that had a DSP it cost
around $5-10; nowadays they are down to around $3. That's not very
expensive anymore, but finding a programmer to work locally can be tough.
 
J

Joerg

rickman said:
Yes, but 8051s aren't DSPs either, are they? You seem to be switching
gears again. I can't keep up. I know you do different designs, but can
the FPGA be wrong for *all* of them? You seem to have all requirements
for all designs.

I do various designs, sometimes simultaneously. For DSP we often just
plop down a TMS320 bare-bones edition and be done with it. It's like
buying a Ford F-150 for the ranch, it may be too big but it is not
expensive and you almost can't go wrong with it. I had designs where the
DSP workload ended up at 5% but at $3 pop nobody was concerned.

And no, mostly I don't even have the requirements until after the
project already started. Sometimes weeks down the road the sensor guys
call in, "Houston, we have a problem". This is the kind of project
companies like to use consultants for, since it can be utterly
frustrating for engineers. We consultants are used to this stuff.

Why would you need to change to the whatever-03? Once it is qualified
you can stick with it. My point is that you have flexibility in the
device, no one is making you switch.

I was thinking about the case where whatever-02 becomes unobtanium.
Why not talk to the vendors?

We did that and all we got was a "Sorry about that". The designed-in
device was discontinued.
Yeah, I'm unhappy about it. I thought I could get more development
funds for a redo but the division reselling this board in their product
doesn't want to spend any cash on it. I've been asked to spend my dime
and I likely will. I make good money on this product.

Looks like a good business opportunity for you :)
How do you know these are the parts that were designed in the system of
interest? They made a huge number of variants and I know I have seen
EOL notices for Pentium 4s.

Well, you do have to look in the schematics. You only asked whether one
can still buy Pentium 4 and I said yes, and gave evidence. Are you
changing the game now? :)

Legacy stuff in the PC world does not go away fast. To this day you can
still easily buy brand-new ISA-bus PCs. Because scores of them are used
in production facilities. I helped replace one a few years ago and it
also had a processor from the days of Methuselah.

If you really need that level of consistency, then you will be using
nothing but 8051s all your career. I don't know of any digital
component that has lived as long as the 8051 other than perhaps LS-TTL.
I also don't know of any other MCU that is second sourced. If the
8051 does what you need, then go for it. But again you are mixing
conversations. That's why it is so frustrating to have a conversation
with you. ...


I merely said it matters in some cases. Not in all cases.

... You talk about not being able to use a part unless it has a
product life as long as the 8051 and then you talk about using various
DSP chips in the same context. I *know* you won't be able to buy those
DSP chips 10 years from now. TI just doesn't provide that level of
support unless they have a special program for long lived parts I'm not
aware of. I've seen lightly selling DSPs drop from the marketplace
after less than 5 years.

Well, let me show you a blast from the past ...

http://www.rocelec.com/search/finis...ins/?utm_source=supplyFrame&utm_medium=buyNow

20,286 in stock, ready to ship.

The DSP market was just a tiny exploration by TI initially. Then they
saw cell phones as a way to utilize that capability. They actually
reorganized the entire company to take full advantage of it. As a
result they ended up with four segments for DSPs.

Analog Devices had the market first; they really ruled in the early
90's. We had boards with about a dozen 16-bit FP DSPs on them.

1) Cell phone devices - small, low power and cheap in large quantities.
Not much need for longevity at all... basically the C5xxx line.

2) Cell base stations - powerful devices that can handle multiple
channels, power consumption not important and cost is secondary. This
is the C6xxx line. Again, they focus on new, not longevity.

3) Scientific DSP - floating point. C67xx lines. Relatively low
volumes compared to the other two, but they seem to think it is an
important market. New designs are not as frequent. Longevity might be
better than the other two, but no promises.

4) Motor control, white goods, etc - fixed point with price the major
factor. These have appeared in a range of variations, some with flash,
some with ADCs, etc. These are almost MCUs with similar performance,
slow compared to segments 1 and 2. Intended for high volume apps, but
again, longevity is not important.
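The bread-and-butter workload for that last category is fixed-point filtering. As a concrete (host-runnable) sketch, here is a bare Q15 FIR loop; the function name and values are purely illustrative, not any vendor's library API:

```c
#include <stdint.h>

/* Minimal Q15 fixed-point FIR: multiply-accumulate over the taps in a
   wide accumulator, then shift back down to Q15. No saturation here,
   which real DSP libraries would add. */
int16_t fir_q15(const int16_t *coef, const int16_t *x, int taps) {
    int32_t acc = 0;                     /* 32-bit accumulator */
    for (int i = 0; i < taps; i++)
        acc += (int32_t)coef[i] * x[i];  /* Q15 * Q15 -> Q30 */
    return (int16_t)(acc >> 15);         /* Q30 back to Q15 */
}
```

On the fixed-point parts this inner loop maps to a single-cycle MAC instruction, which is most of what separates them from a plain MCU.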

So if you are going to consider DSPs for your apps, I expect you would
be looking at the last category. I'm pretty sure I wouldn't be
designing from this group if I wanted to be building this board 10 years
from now though. Have you talked to TI about longevity?

Not yet. That comes if I decide to have a DSP in a project. But mostly
my clients have those discussions because that's their turf, I am more
the analog guy. On large projects stuff gets put in writing about
guaranteed years of supply. On some chips it goes as far as putting the
mask data in escrow, especially with smaller companies where there is a
chance of them going belly-up down the road.
 
It seems the prices have come down in recent years, but still, the parts
I have seen have no Flash. So you need to add in that cost. But the
Sigma parts aren't really general purpose. They are good if you can
make your app fit the DSP design, otherwise they aren't much use. I
pursued them hard a few years ago until an FAE just threw in the towel
and said I couldn't do my app on their part.

Good grief. The issue wasn't to show YOU that YOUR application was
better in a DSP. Like many FPGA weenies, you're trying to sell a part
that has a niche market as the universal hammer.
That makes no sense.

Hammer, meet nail.
There will always be some designs that a given
part is a perfect fit for, but that doesn't mean different devices can't
be compared. The question is what is the best fit for a given job.

That is *NOT* what you're arguing. You're making the general case
that FPGA >> DSP >> uC, which is just silly.
I am hearing some say that FPGAs aren't the best fit and I find they often
are a better fit than an MCU.

Hammer, meet nail.
Much of it has to do with mis-information
about what FPGAs can and can't do and what is required to make them run.
Nonsense.

Just read Joerg's post.

I have.
Much of the stuff he objects to is specific
to the individual devices he has worked with.

Like DSPs. I agree with him. FPGAs aren't in his future. You keep
sugar-coating FPGAs and (erroneously) tear down DSPs. Note that I'm
more of an FPGA kind of guy than a DSP sort but in this case Joerg is
absolutely right. FPGAs only compete in small niche markets and those
where money is no object.
I have not found a big difference in software. The software is
different, but those differences are not important. It all compiles my
HDL fine (mostly because they often use the same third party tool
vendors) and simulation just works these days.

The software is different in how it works, not what it does. That
difference makes *NO* difference to the end result or the cost of the
product. IOW, it's completely irrelevant. At one time it may have
been important but only in so much as that much of it didn't work
(making the hardware useless).
I don't know what devices you work with, but the ones I use are easy to
program.

Pile on more sugar. You clearly don't work where time is money.
The CPU is the easy part to port, the compiler handles that for you. It
is the drivers for the I/O that are harder.

That's all included in the port. I'm talking from working hardware to
working hardware (the target system not qualified, of course). There
is only about 10% of the code that even has to be looked at.
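The usual way that 10% gets isolated is a thin port layer: the application talks to a table of function pointers, and only the stubs behind it change per chip. A hypothetical sketch (names made up, host-runnable stand-ins for real drivers):

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical port-layer interface: only this table's implementations
   change when moving to a new chip; the application code stays put. */
typedef struct {
    void (*uart_putc)(char c);
    uint32_t (*adc_read)(int channel);
} board_io;

/* Host-side stand-ins for one target's drivers. */
static void host_putc(char c) { putchar(c); }
static uint32_t host_adc(int channel) { return 512u + (uint32_t)channel; }

const board_io target_a = { host_putc, host_adc };

/* Application code -- identical from port to port. */
uint32_t sample_and_log(const board_io *io, int ch) {
    uint32_t v = io->adc_read(ch);   /* read one sample */
    io->uart_putc('.');              /* log a progress tick */
    return v;
}
```

Whether the indirection lives in function pointers, link-time substitution, or vendor HAL headers is a detail; the point is the application sees one interface.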
Their libraries have to have
compatible interfaces and every port is a port.

Wrong. That's all included.
With FPGAs, all you
need to do to switch between brands is normally a new pin list and
timing constraints.

Bullshit! More sugar!
The HDL just compiles to suit the new device.

Oh, you never use libraries? Yet you (erroneously) add that cost into
the DSP/uC bucket.
It has been a while since I ported between brands but it would make sense
if they provide tools to port the timing constraints. That is the only
part that might be any work at all.


Care to elaborate?

You've TOTALLY forgotten about simulation, for instance. That's a
huge effort that you simply sweep under the rug.
You can find a small number of DSPs with CD quality CODECs and the
same for MCUs. I know, I did this search recently. I didn't find much
and none that suited my other criteria. So the redo of my board will
likely have another FPGA on it.

Goal post shift added to the hammer.
I would appreciate a list of the MCUs/DSPs which have stereo CD quality
CODECs on chip. The Sigma parts from ADI don't count because their DSPs
can *only* be used for certain coding like filters, not general purpose
use.

Sigmas have them. I haven't looked for others.
You are mixing apples and oranges. One manufacturer has many different
families of FPGAs, no? Some are huge power hungry devices that burn a
hole in your board. Others are much lower power and don't burn a hole
in your pocketbook either.

The families all look the same and vary only in density and mix of
memory, speed, MCU, DSP(hmm), and other features.

Good grief, you're arguing both sides.
 
Nonsense. I constantly look for MCU solutions for my designs.

You certainly don't look very hard. I keep looking for FPGA solutions
and haven't found one yet. ;-)
I won't argue that. But I don't consider CPLDs in the same vein as
FPGAs, but you are right, the distinction is blurring.

The architecture is the same. They're the same.

OK, let me ask the question(s) I've asked every one of the FPGA
suppliers; Define FPGA. Define CPLD. They can't. It *IS* marketing.

That is the sort of thinking that is just a pair of blinders. I don't
care if the real estate is "expensive". I care about my system cost.

Part cost ~= system cost. MCUs are so cheap that any soft core is
useless. The development costs are a lot less, too. The tool chains
for the embedded stuff suck.
Gates in an FPGA are very *inexpensive*. If I want to use them for a
soft core CPU that is just as good a use as a USB or SPI interface.

Good grief. Do you have diabetes?
Your opinion. I don't sell 100,000 quantities, so the prices I get at
Digikey are often competitive with the other distis. Certainly they
give you a ball park number for comparison purposes.

If you're buying no more than 1K pieces, you're sorta stuck with the
DigiKeys of the world. I am for prototypes, though I build as many
prototypes as I did in production at my last job. ;-)
The point is that
with FPGAs, *no one* gives you a good price unless you get the
manufacturer involved. That is one down side to FPGAs.

Sure, I'll buy that but it just solidifies the fact that FPGAs really
aren't mainstream components. They are a niche and probably always
will be, unfortunately.
 
I don't see how the equivalent of a TMS320 or a big MSP430 could fit
into one of these small Lattice devices.

BTW, watch the TMS320 5000series parts. The DMA is seriously broken
if you're using the BSP (I2S/TDM interfaces). The McBSP sucks, too,
but that's a different issue. Last I knew they had no intention of
fixing I2S/TDM DMA, either.
 
So for this kind of solution in an FPGA you need a DSP specialist and an
FPGA specialist? That would be a problem.

Pick ones that are in the automotive market. Support for fifteen
years is required.
How long do the usual FPGA stay in the market? Meaning plop-in
replaceable, same footprint, same code, no changes.

The usual? About 30 minutes. ;-)
 
Yes, we've had many discussions about that part ;-).


The engineering analysis is implied: it takes far more silicon to
implement a microprocessor in LUTs than directly in silicon, plus you
lose a lot of speed because of all the additional layers and lookups.

Yes, and add the "markup" for *being* an FPGA (production quantities).
Even more silly. I've done silly things like this but only because of
stupid management edicts (*NOT* based on any engineering analysis).
Yes, the ASIC bypassed the relative inefficiency of doing the same thing
in FPGA's. It would be cool to have some tiny processors like that
available as hard cells on small FPGA's.

Yes, but it would still be quite expensive, compared to a similar
(external) uC, including the I/O.
 
J

Joerg

Pick ones that are in the automotive market. Support for fifteen
years is required.

That's a good point. How does one find out which ones those are?

Mil stuff is even better, that needs to remain available for decades.
That is the reason why the LM331 is still around and why I used it in a
long-life design many years ago.

The usual? About 30 minutes. ;-)

:)
 
R

rickman

Partly I suppose.

Or I could say that my projects so far all require a microcontroller
anyway, and it seemed likely that a separate FPGA was always going to be
more expensive than, say, choosing a faster CPU.

An STM32F4 can bitbang a PIO at 84 MHz. (It can't do anything else then,
but still...)
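The bitbang itself is just back-to-back writes to the GPIO output data register. A sketch of the idea, redirected at a plain variable so it runs on a host (on real silicon the pointer would be GPIOA's ODR, 0x40020014 per the STM32F4 reference manual):

```c
#include <stdint.h>

/* Stand-in for the memory-mapped output data register so the sketch
   compiles and runs on a host instead of the real chip. */
uint32_t fake_odr;
volatile uint32_t *gpioa_odr = &fake_odr;

/* Toggle pin 5 n times. On the real part you'd unroll this and let the
   core do nothing else -- that's how the toggle rate above is reached. */
void bitbang_toggle(int n) {
    for (int i = 0; i < n; i++)
        *gpioa_odr ^= (1u << 5);
}
```

The read-modify-write shown here is the slow form; the chip's BSRR set/reset register avoids the read, which matters at those speeds.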

I think what you are saying is that the MCU is a key part of your design
and you use a lot of code in it. Ok, if your emphasis in on using a
commercial MCU that will do the job. But unless your MCU needs are just
too large for something that fits in an FPGA, you have it backwards in
my opinion. Why have both when you can just use an FPGA?

OK, thanks, will check them out.

I haven't gotten a quote on these parts since they were bought by
Lattice. I'd appreciate a pricing update if you get one. They should
be able to do a lot better than the Digikey price, I know Xilinx and
Altera always do. Heck, the Digikey pricing for most FPGAs doesn't go
above qty 1... if nothing else there should be some quantity price breaks.
 