As a non-professional programmer, I started out in the 1970s using Intel 8-bit microprocessors for embedded applications, programming them in assembly. As the years went by and the personal computer took over the microprocessor market, the internals of the microprocessor became pretty much incomprehensible to me. I am sure the advanced designs were driven more by software considerations (compilers and OSes) than anything else, not least the cozy relationship between Intel and Microsoft that continues to this day.
I gave up trying to use assembly when the Intel 80286 family arrived; the instruction set just became way too complicated. I also pretty much gave up writing embedded code in assembly, except for the occasional DEC PDP-11 driver for external I/O interfaces... and even there I had DEC open-source driver "templates" to follow (and modify) for a particular operating system. It is absolutely essential to know and understand the OS when interfacing a CPU to the real world.
Of course, most embedded applications in the 1970s didn't have or need an OS. You, the hardware engineer, were responsible for everything. But a lot of advanced microprocessor architecture is now centered on efficient high-level language processing... maybe all of it, AFAIK. So, as Bob said, it makes sense to let those who are doing the programming select the architectural platform they are most comfortable/experienced with, unless you have an overriding hardware need that requires a particular platform.
As for PICs... these were new to me a couple of years ago, although they have been around for a looong time, under the name Peripheral Interface Controller, for use with minicomputers and small mainframes. I got involved with them a few years ago while trying to help with the Flashlight Project. I was attracted by their programmable internal FLASH memory, low cost, low power, small footprint, and significant peripheral interface capabilities, both analog and digital. I had played around with Arduino a few years prior, and my electrical-engineer son had even gifted me a small Raspberry Pi, which I got to boot up with a Linux distro but never took any further... more power than I usually needed for embedded microprocessor projects, although the Internet of Things (IoT) may eventually change my mind about that.
Back in the day, when PCs were just coming on the market with "IBM compatible" clones proliferating from Asia, I went to work as a newly-minted electrical engineer for a systems development firm that used Digital Equipment Corporation (DEC) minicomputers, replacing an engineer who had started a portable data acquisition project based on the 8-bit Intel 8085 microprocessor. I finished that project and was then asked to "upgrade" a raster-scanning optical micro-densitometer with an H-P plane-mirror laser interferometer interface retrofitted to the X-Y film scanning stage. I knew diddly about PDP-11 minicomputers at that time, so I specified the use of an IBM PC-AT personal computer, with a custom-designed interface to the H-P interferometer. I would design the hardware interface, a technician would wire-wrap the interface boards, and a programmer would roll the code.
Management went through the roof! This was a DEC house, I was told, and you will use a PDP-11/34 running RSX-11M or (possibly) RT-11, but not a "toy" computer, even if it does have IBM on the nameplate. Years later I found out that no one ever got fired for specifying IBM, but I stuck to my guns anyway and insisted on using the PC.
Since I was the only electrical engineer on the staff, their only options were to replace me or to let me dig my own grave. So they loaned me a programmer who did his best to break the Microsoft FORTRAN compiler I had purchased for the PC, spending several months on that effort while the technician and I proceeded with the design of the high-speed up/down counters needed to track the X-Y stage movements and control the stage drive motors. The project was a huge success: I got to publish a (non-peer-reviewed) paper that I presented at a laser conference in Florida the next year, and our customer was quite pleased with their new ability to selectively digitize the density of small areas of photographic reconnaissance film, arbitrarily chosen from a larger sheet. This was back when almost all aerial image collection was done on photographic film rather than with digital sensor platforms, as it is today. The demise of high-resolution film for overhead imagery no doubt contributed to the decline of Kodak as a large-volume supplier, but some intelligence and reconnaissance agencies held on to the bitter end... as did my employer, with its dedication to obsolete DEC technology, but by then I had moved on.
So don't take it personally that your supervisors don't want to use PICs. Find out what they DO want to use, whether that be Renesas (Intersil) or some other platform, and "go with the flow."
BTW, PICs are not Mickey Mouse, even if they are popular with some hobbyists; like any μP, they have their strengths and limitations, and you must be aware of both before deciding whether to use one, not to mention which one to use. A 32-bit processor may have some advantages in terms of I/O bandwidth, and perhaps a particular brand may feature a better OS implementation and better software support, but you haven't told us WHY you even NEED a 32-bit processor. If it's because management says so, then you can either accept that or find another career path.