Maker Pro

Choosing a 32-Bit MCU

OK, so I know this is a very subjective topic, and probably one that has been discussed at length, but I have a few (hopefully) more specific questions.

A bit of background first. I have been designing electronic products professionally for 25+ years, for various companies. In that time an 8-bit controller has been sufficient for everything I have needed to do. I started off with Motorola and Intel parts before moving to PICs circa 1996 (16C, then 18F), which I still use today. Obviously in this time I have got to know not just the devices but also Microchip as a company, and I have found their technical support, literature, tools and software to be good. To this end I have never felt the need to look elsewhere.

Now I have a project that requires a 32-bit device, and straight off I thought PIC32. However, this has raised eyebrows among some of my overseas colleagues (I'm not being specific here), who exclusively use Renesas parts. I have considered a 32-bit ARM device, but I can't see what it would offer me that the PIC32 won't.

Details-wise, the project is not power sensitive, nor does it need to be the ultimate processing engine!

I have had 8-bit designs rejected in the past simply for using a PIC.

Am I missing something? Are PIC devices seen as a bit Mickey Mouse / hobbyist? Would it be a mistake to use one?

Thanks for your input!
 
If the hat fits...

Rejection of a design is usually on a cost and/or reliability basis - maybe even code security, and potentially security of supply - but if all those criteria are met then I don't know why any development department would reject one. Didn't they tell you why?

Do you (does the project) need 32 bits?

I'd line up the system development requirements - I/O, A/D and D/A, communications, etc. - see which devices fit them, then consider other factors such as availability, lifespan, security, programming and cost (not necessarily in that order), and narrow the list down to, say, half a dozen.

Then ask the next level of decision-makers which they prefer. If they have the 'ability' to refuse a design for reasons unknown, then they should have the ability to pass back their preference for device selection.
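To make that shortlisting step concrete, here is a minimal C sketch of a weighted decision matrix. Every device name, criterion, weight and score below is invented for illustration, not a recommendation - substitute your own requirements and candidates:

#include <stdio.h>

/* Hypothetical shortlist scoring: all weights and scores are made up. */
#define NUM_CRITERIA 5
#define NUM_DEVICES  3

static const char *criteria[NUM_CRITERIA] =
    { "I/O fit", "A/D-D/A", "Comms", "Availability", "Cost" };
static const int weights[NUM_CRITERIA] = { 5, 4, 4, 3, 3 };

static const char *devices[NUM_DEVICES] =
    { "PIC32 (example)", "Renesas RX (example)", "ARM Cortex-M (example)" };

/* Scores 0-10 against each criterion, one row per candidate device. */
static const int scores[NUM_DEVICES][NUM_CRITERIA] = {
    { 8, 9, 7, 8, 8 },
    { 8, 8, 8, 7, 7 },
    { 9, 7, 9, 9, 8 },
};

int main(void)
{
    printf("Criteria:");
    for (int c = 0; c < NUM_CRITERIA; c++)
        printf(" %s(w=%d)", criteria[c], weights[c]);
    printf("\n");

    for (int d = 0; d < NUM_DEVICES; d++) {
        int total = 0;
        for (int c = 0; c < NUM_CRITERIA; c++)
            total += weights[c] * scores[d][c];
        printf("%-24s weighted score: %d\n", devices[d], total);
    }
    return 0;
}

The point isn't the numbers; it's that writing the criteria and weights down forces the requirements discussion into the open before anyone argues about brands.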
 
If your customer is programming the device, it makes sense for them to specify the type of processor; otherwise, not so much.

I read a comparison a few months ago of pretty much all the microcontroller families, which tested them for performance and power usage using the same algorithms. PICs came in last in performance but best in performance per watt, so they are really good when battery life is more important than speed.
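For what it's worth, that metric is just the benchmark score divided by power draw. A tiny C sketch with made-up numbers (not figures from the comparison I read) shows how a slower part can still win on the ratio:

#include <stdio.h>

/* Illustrative only: scores and power draws are invented numbers. */
int main(void)
{
    const char *mcu[]  = { "8-bit PIC (example)", "32-bit ARM (example)" };
    double score[]     = { 10.0, 100.0 };  /* arbitrary benchmark units */
    double power_mW[]  = { 4.0, 120.0 };   /* assumed active power, mW  */

    for (int i = 0; i < 2; i++)
        printf("%-22s perf = %6.1f, perf/W = %7.1f\n",
               mcu[i], score[i], score[i] / (power_mW[i] / 1000.0));
    return 0;
}

With these invented figures the ARM part is ten times faster, but the PIC delivers 2500 units per watt against the ARM's roughly 833 - the sense in which "last in performance, best in performance per watt" can both be true.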

I happen to be a PIC user merely because what got me started was a design using a PIC for a NiCad charger. I have stuck with them and been very satisfied. I tend to use the 16-bit PICs now because I need more data and program space than the 8-bit ones offer. I have not tried the PIC32.

Bob
 

hevans1944

Hop - AC8NS
As a non-professional programmer, I started out in the 1970s using Intel 8-bit microprocessors for embedded applications, programming them in assembly. As the years went by, and the personal computer took over the microprocessor market, the internals of the microprocessor became pretty much incomprehensible to me. I am sure the advanced designs were driven more by software considerations (compilers and OSes) than anything else, especially the cozy relationship between Intel and Microsoft that continues to this day.

I gave up trying to use assembly when the Intel 80286 family arrived. The instruction set just became way too complicated. I also pretty much gave up writing embedded program code in assembly, except for the occasional DEC PDP-11 driver for external I/O interfaces... and even there I had DEC open-source driver "templates" to follow (and modify) for a particular operating system. It is absolutely essential to know and understand the OS when making a CPU architecture interface to the real world.

Of course, most embedded applications in the 1970s didn't have or need an OS. You, the hardware engineer, were responsible for everything. But a lot of advanced microprocessor architecture is now centered around efficient high-level language processing... maybe all of it, AFAIK. So, as Bob said, it makes sense to let those who are doing the programming select the architectural platform they are most comfortable/experienced with, unless you have an overriding hardware need that requires a particular one.

As for PICs... these were new to me a couple of years ago, although they have been around for a looong time, under the name Peripheral Interface Controller, for use with minicomputers and small mainframes. I got involved with them while trying to help with the Flashlight Project. I was attracted by their internal flash program memory, low cost, low power, small footprint, and significant peripheral interface capabilities, both analog and digital. I had played around with Arduino a few years prior, and my electrical-engineer son had even gifted me a small Raspberry Pi, which I got to boot with a Linux distro but never went any further with... more power than I usually need for embedded microprocessor projects, although the Internet of Things (IoT) may eventually change my mind about that.

Back in the day, when PCs were just coming on the market with "IBM compatible" clones proliferating from Asia, I went to work as a newly-minted electrical engineer for a systems development firm that used Digital Equipment Corporation (DEC) minicomputers, replacing an engineer who had started a portable data acquisition project based on the 8-bit Intel 8085 microprocessor. I finished that project and was then asked to "upgrade" a raster-scanning optical micro-densitometer with an H-P plane-mirror laser interferometer interface retrofitted to the X-Y film scanning stage. I knew diddly about PDP-11 minicomputers at that time, so I specified the use of an IBM PC-AT personal computer, with a custom-designed interface to the H-P interferometer. I would design the hardware interface, a technician would wire-wrap the interface boards, and a programmer would roll the code.

Management went through the roof! This was a DEC house, I was told, and you will use a PDP-11/34 running RSX-11M or (possibly) RT-11, but not a "toy" computer, even if it does have IBM on the nameplate. Years later I found out that no one ever got fired for specifying IBM, but I stuck to my guns anyway and insisted on using the PC.

As the only electrical engineer on the staff, their only option was either to replace me or allow me to dig my own grave. So they loaned me a programmer who did his best to break the Microsoft FORTRAN compiler I had purchased for the PC, spending several months on that effort while the technician and I proceeded with the design of the high-speed up/down counters necessary to track the X-Y stage movements and control the stage drive motors. The project was a huge success: I got to publish a (non-peer-reviewed) paper that I presented at a laser conference in Florida the next year, and our customer was quite pleased with their new ability to selectively digitize the density of small areas of photographic reconnaissance film, arbitrarily selected from a larger sheet. This was back when almost all aerial image collection was performed on photographic film instead of the digital sensor platforms used today. The decline of high-resolution film for overhead imagery no doubt contributed to the demise of Kodak as a large-volume supplier, but some intelligence and reconnaissance agencies held on to the bitter end... as did my employer's dedication to obsolete DEC technology, but by then I had moved on.

So don't take it personally that your supervisors don't want to use PICs. Find out what they DO want to use, whether that be Renesas (Intersil) or some other platform, and "go with the flow."

BTW, PICs are not Mickey Mouse, although they may be popular with some hobbyists. Like any μP they have their strengths and their limitations, and you must be aware of both before deciding whether to use one, not to mention which one to use. A 32-bit processor may have some advantages in terms of I/O bandwidth, and perhaps a particular brand may feature a better OS implementation and better software support, but you haven't told us WHY you even NEED a 32-bit processor. If it's because management says so, then you can either accept that or find another career path.
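On the "do you NEED 32 bits" question, one concrete way to see what word size buys you: arithmetic on 32-bit quantities is a single instruction on a 32-bit core, but an 8-bit core's compiler has to decompose it into many 8-bit operations. A C sketch of that decomposition follows - purely illustrative, not any real compiler's output:

#include <stdint.h>
#include <stdio.h>

/* 32x32 -> 32-bit multiply built from 8-bit limbs, roughly the work an
 * 8-bit core performs in software; a 32-bit core does this in one
 * multiply instruction. */
static uint32_t mul32_via_8bit_limbs(uint32_t a, uint32_t b)
{
    uint32_t result = 0;
    for (int i = 0; i < 4; i++) {              /* each byte of a */
        uint8_t limb_a = (uint8_t)(a >> (8 * i));
        for (int j = 0; j < 4; j++) {          /* each byte of b */
            if (8 * (i + j) >= 32)
                continue;                      /* partial product falls off */
            uint8_t limb_b = (uint8_t)(b >> (8 * j));
            result += (uint32_t)((uint16_t)(limb_a * limb_b)) << (8 * (i + j));
        }
    }
    return result;
}

int main(void)
{
    uint32_t a = 123456789u, b = 362436069u;
    printf("limb-wise: %u\n", (unsigned)mul32_via_8bit_limbs(a, b));
    printf("native:    %u\n", (unsigned)(a * b));  /* one instruction on a 32-bit core */
    return 0;
}

If your data and maths fit comfortably in 8 or 16 bits, that overhead may never matter; if you are doing lots of 32-bit arithmetic, DSP-style filtering or a TCP/IP stack, it is a genuine argument for the wider core.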
 