Maker Pro

USB microscopes for very small SMT


Joerg

JW said:
IMO for doing surface mount work, anything over 5x magnification is
overkill. So my question would be what the *minimum* magnification would
be. Additionally, at lower magnifications how much of the work area would
be displayed?


Minimum magnification is 20 but if, as someone in a German NG wrote, it
works at 5" or more away it might just work. Thing is, there is almost
nothing available from the mfg and they did not respond to email.
Usually that is a no-no for me but it seems it's about the only such
product out there in a reasonable price range (where you don't mind if
it gets busted or "lost").

Supposedly it can focus at infinity which would (theoretically) mean I
can place it even farther away. If the LEDs aren't strong enough I'll
just use a separate strong halogen light.
 

Joerg

TTman said:
Just got mine... quite good, but the further away the target is, the harder
it is to focus. Quite a bit of backlash on the focus ring too. Tricky at
150mm, easy at 50mm. 5 out of 10.....

Hey, someone's got it, great! If it really can focus at 150mm and lowest
zoom that'll be fine. Even if tricky and with rickety focus wheel, I'd
just leave it there pretty much forever. Couple of questions:

Can you easily see 0402 or smaller parts at 150mm, good enough to guide
a solder iron?

Where did you order it?

Maybe one can hack it and kludge a better lens onto it. Tried that with
a web cam but that didn't work, the cheapo CMOS sensor in there just did
not cut it.
 

Spehro Pefhany

Minimum magnification is 20 but if, as someone in a German NG wrote, it
works at 5" or more away it might just work. Thing is, there is almost
nothing available from the mfg and they did not respond to email.
Usually that is a no-no for me but it seems it's about the only such
product out there in a reasonable price range (where you don't mind if
it gets busted or "lost").

Supposedly it can focus at infinity which would (theoretically) mean I
can place it even farther away. If the LEDs aren't strong enough I'll
just use a separate strong halogen light.

Here's a nice one, but I somehow doubt it fits your budget (especially
since they don't advertise the actual price-- the VHX-600 is probably
well into 5 figures):

http://www.digitalmicroscope.com/



Best regards,
Spehro Pefhany
 

Joerg

Spehro said:
Here's a nice one, but I somehow doubt it fits your budget (especially
since they don't advertise the actual price-- the VHX-600 is probably
well into 5 figures):

http://www.digitalmicroscope.com/

That would be way above budget plus those often aren't suited for larger
working distances. I can't get a 16" by 16" circuit board under it but I
have to ;-)
 

krw

S/"IBM"/"Bill Gates"
YES! The 64K "feature" on the motherboard (meaning the RAM was soldered
in).

For good reason, and only if you bought the original 5150 with the
64K motherboard. A 16K motherboard only had 16K soldered in
(Duh!). OTOH, the 64K motherboard was almost impossible to upgrade
beyond 64K, so I don't know why it would matter. The PC-2 and XT
had 256K motherboards (I don't believe they came with less) that
were easy to upgrade to 640K.
More than enough to run multi-user MP/M, even tho it was a hog,
eating 48K TPA (56K TPA for compile).

My 5150 had 720K. Actually, I still have it but it hasn't been
powered on in 20 years.
 

JosephKK

Nope, a doctor who regularly spends more than x minutes per patient will
get replaced :-(

That depends primarily on the particular doctor; the time they bill
and the time they spend with a client haven't been remotely correlated
for decades. If they can't figure this one out, their income level
will be about that of a salaried engineer.
 

Joerg

Joel said:
Yes, but I very much suspect that if the bulk of programmers (and hardware
designers) weren't relatively "sloppy" it would actually slow down
technological innovation -- if the guys who managed to fit, e.g., complete
BASIC interpreters into 8kB were still running the show today, I very much
doubt that you'd be able to buy 1TB hard drives for $99 and 16GB USB memory
sticks for $19.99.

Thing is, then we wouldn't need 1TB hard drives. And downloading
thousands of MP3 songs is something our society could live without. I
never downloaded one song and I am still alive.

Software bloat doesn't bother me nearly as much as the fall in software
quality (from the bug-free perspective)...

Even if the programmers exhibited the same level of diligence today as
they did in the 80's, code that is bloated 100-1000 times in size is
bound to have a huge increase in the number of bugs.
 

Joerg

Joel said:
Sure, but my point is that are some "hard" problems out there that are worth
pursuing, and these problems benefit greatly as a direct result of the cheap
CPU cycles and terabytes that come as a result of more "frivolous" needs such
as storing your MP3s, DVD collection, having an animated paper clip ask you if
you're writing a suicide note, etc. :) For instance, this Christmas season
there are several $99 (and below) GPS navigation units available. Now, at
least to me, GPS navigation units do address a worthy problem -- helping
people get from point A to point B more easily and with fewer errors.


Sure, but: I know lots of people who have either unlearned or never
bothered to learn map reading. And then the batteries in the GPS unit
quit ...

It's like garage door openers. Power went, neighbor called "HELP! Can't
get my car out but need to go, now!". Went over there, pulled little
rope, clunk, rolled up door. "Wow, I didn't know this was possible!"
Well, I grew up in Europe where there are hardly any Genie drives on
garage doors.

Same with computers. Us older guys look at a circuit, kind of mush
inductors and caps into fuzzy math in our brains and conclude that it
ain't gonna work with those values, all within two sips of coffee or so.
Younger lad comes in, scratches head, says he'll simulate that. A half
hour later he sticks his head into the lab. "We need to increase L5 to
18.63 microhenry" ... "Ahm, we already stuck a 22uH in there and it's
fixed" ... "Oh".

Internally the map database for these units is often on the order of a
gigabyte and you've got some ~hundreds of MIPS CPU running the show, driving a
color LCD touchscreen and often having a Bluetooth phone interface as well.
Without the sort of "quick and sloppy" programming techniques that drove the
demand for cheap but fast CPUs and large storage devices, it'd probably be
another decade before the price for something like this would be under $10k.

And then this wonderful map sends you out on a "shortcut". Suddenly
after a while of four-wheeling you run out of road. Happened to a
neighbor. Or take UPS. After two failed delivery attempts showed up in
tracking they said "According to our map system the street and address
both don't exist" ... "Ahm, they do, and have for about 35 years" ...
"I'll be darn!"

Well, I'd agree it's likely to have an increase, although I can't really say
"just how huge" that increase would be. :) Some people adopt the rather
pessimistic view that there are so many bugs per line of code, so the more
lines of code... the more bugs. Personally I think that with good software
development techniques, for the *overall product*, the number of bugs per
line of code should drop as the LOC increases, because the overall product
isn't going to exercise every possible execution path and thereby be hit with
every bug.

I have adopted that pessimistic view, for the most part. Larger software
projects add one more layer of risk: Management. I often found that to
be a serious problem. Discipline is another matter. I can't even count
the cases of memory hogging anymore. "Could it be that you allocated a
huge swath of megabytes and never let go?" ... "Oh dang, that could be it".

Those are all things that actually happened ;-)
 

Joerg

John said:
Music is the newest narcotic.

Maybe I missed out on a lot in life. Never did drugs so I don't know how
a narcotic feels ;-)

But I do need my daily dose of Country/Bluegrass/Americana in the
evenings. Always did.
 

Joerg

Jan said:
Yes, much of the increase in size comes from using libraries.
Sometimes where there is no need for one at all.
An example is Qt4 in Linux; it is huge, while for many if not most
applications you could use for example xforms, which is very, very small (I use nothing else).
Another big joke is people using SDL for every application that has audio
and/or video.
But the advantage of the libraries is that they are so often used that they are pretty much bug free.
And they *can* add very powerful features.
So indeed the increase in bugs may not be that big.
If I were to code a formatted print in asm, or use C and libc with printf(),
then clearly the last method is what gives the fewest bugs (and the least development time).

It is the same with hardware, I suppose: I re-use circuits I have designed and tested in the past.
That saves time and increases reliability, although perhaps in some cases
you could, if you started from scratch, save a transistor....
In fact all integrated circuits are like software libraries: you may not
use all their features, but they provide instant solutions.
I still remember designing my own video ADC in the seventies...
It is only because of the ever smaller transistor sizes that it has not become 'hardware bloat'.


IMHO one of the more sorry examples is TLAVu for Tektronix logic
analyzers. Using it right now. Slow as a tortoise, file sizes >3MB,
panning on a dual-core with 2GB of RAM takes several seconds, plus the
occasional crash. Man, the output of my old Dolch in a DOS window could
run circles around that.
 

Joerg

Joel said:
I think the sad thing is that there's a good chance the programmers of that
software are *skilled* enough that it could be fast, use smaller files, and
crash-free... but management has decided that those outcomes are not worth
pursuing.

Not sure about that. A decent programmer would quit should this happen a
lot. I would. I cannot and could never bring myself to designing a
sub-par product.

This often happens when the managers don't have enough of a technical
background to *know* that what their charges are producing is sub-par.

It happens when they fail to talk to the group of people they should
communicate with the most: Customers.

Maybe you'd like to move back into technical management in another decade
rather than retiring, Joerg? :)

Actually I enjoyed those phases. Had to do it twice so far, the second
time with full P&L responsibility and such. I would not mind doing it
again, maybe some kind of "salvage case". I do get asked at times but
the problem with self-employed folks like myself is that I cannot simply
pack up and move. Can't alienate clients. More and more seem to depend
on outside skills for analog these days because they either can't find
any or don't want to have it in house. That was a huge problem last time
I took over the helm somewhere. For one company I was the only "go to"
guy when it came to analog and discrete stuff. Now it's 3-4 companies
out of roughly a dozen.
 

krw

Thing is, then we wouldn't need 1TB hard drives. And downloading
thousands of MP3 songs is something our society could live without. I
never downloaded one song and I am still alive.

I've never downloaded a song either, but I have my black vinyl on
disk, not to mention digital photos, etc. There are reasons
for performance other than bloat. Not to mention that people
really do want that "bloat".
Even if the programmers exhibited the same level of diligence today as
they did in the 80's, code that is bloated 100-1000 times in size is
bound to have a huge increase in the number of bugs.

They'd be bankrupt.
 

krw

Sure, but: I know lots of people who have either unlearned or never
bothered to learn map reading. And then the batteries in the GPS unit
quit ...

Don't blame GPSs, or any technology, for the lack of map reading
skills when schools and parents are truly guilty here.
It's like garage door openers. Power went, neighbor called "HELP! Can't
get my car out but need to go, now!". Went over there, pulled little
rope, clunk, rolled up door. "Wow, I didn't know this was possible!"
Well, I grew up in Europe where there are hardly any Genie drives on
garage doors.

They never RTFM. That's not technology. Again, put blame where
blame belongs.
Same with computers. Us older guys look at a circuit, kind of mush
inductors and caps into fuzzy math in our brains and conclude that it
ain't gonna work with those values, all within two sips of coffee or so.
Younger lad comes in, scratches head, says he'll simulate that. A half
hour later he sticks his head into the lab. "We need to increase L5 to
18.63 microhenry" ... "Ahm, we already stuck a 22uH in there and it's
fixed" ... "Oh".

That one you can blame on technology. Since calculators, I've
completely lost any ability to estimate values by quick inspection.
Don't need Spice for most things, other than it's very nice for
documentation.
And then this wonderful map sends you out on a "shortcut". Suddenly
after a while of four-wheeling you run out of road. Happened to a
neighbor. Or take UPS. After two failed delivery attempts showed up in
tracking they said "According to our map system the street and address
both don't exist" ... "Ahm, they do, and have for about 35 years" ...
"I'll be darn!"

What makes you think their paper map would be any better? No, GPSs
aren't perfect and have sent me in the wrong direction more than
once, but paper maps aren't perfect, particularly at the local road
level. Most maps don't show interchange detail either. GPSs are a
big win for long distance driving.
I have adopted that pessimistic view, for the most part. Larger software
projects add one more layer of risk: Management. I often found that to
be a serious problem. Discipline is another matter. I can't even count
the cases of memory hogging anymore. "Could it be that you allocated a
huge swath of megabytes and never let go?" ... "Oh dang, that could be it".

So your solution is to not have large projects? ...to not attack
large problems?
Those are all things that actually happened ;-)

Of course they have. Look at the ATCs. Fixing management fixes a
lot more than software though.
 

Joerg

Joel Koltner wrote:


[...]

Cool... I'm sure you'll continue to have many offers if you decide to move out
of active circuit development/troubleshooting one day. :)

Move out? I don't think I'll do that in this here earthly life. They'll
have to pry the old Weller iron out of my cold dead hands.

When I was da boss and held the first "state-of-the-union" style meeting
(where everyone is present) I told the people that when they had trouble
with hardware they should ask me. And that I would never hold it against
anyone because he or she didn't know something.
 

Joerg

krw said:
Don't blame GPSs, or any technology, for the lack of map reading
skills when schools and parents are truly guilty here.

Sure, the problem never lies with the kids but with their mentors.
IMHO parents should take a pivotal role here and not dump all that
responsibility on the school.

They never RTFM. That's not technology. Again, put blame where
blame belongs.

Yep. This was meant to explain how technology makes us complacent. I've
recently re-learned the way our great-great-grandparents made bread,
over a wood fire. Not by turning a knob on the oven and later shoving
the dough into it.

That one you can blame on technology. Since calculators, I've
completely lost any ability to estimate values by quick inspection.
Don't need Spice for most things, other than it's very nice for
documentation.

It's good to grab a slide rule once in a while. I recently upgraded to a
round Scientific Instruments No.250, from a friend who passed away.
Unfortunately the hot summers out here have shrunk its Naugahyde pouch
and it no longer fits into it.

What makes you think their paper map would be any better? No, GPSs
aren't perfect and have sent me in the wrong direction more than
once, but paper maps aren't perfect, particularly at the local road
level. Most maps don't show interchange detail either. GPSs are a
big win for long distance driving.

The maps from this area show the street quite clearly. All of them. But
yeah, if I were a trucker I'd also have GPS. With the emphasis on "also".

So your solution is to not have large projects? ...to not attack
large problems?

Oh no, I am involved in quite a few large projects and thoroughly enjoy
them because I get to work in teams. But those are all well managed. I
fail to see that level of management competence on some software projects.

Of course they have. Look at the ATCs. Fixing management fixes a
lot more than software though.


Many engineers think managers are non-essential. Thing is, they are very
essential. The head of R&D at my first employer was a radio/TV tech by
trade. He often didn't have the foggiest what we were doing but he knew
he could trust us and we knew we could trust him. Excellent manager.
 

Joerg

Joel said:
Hmm... if only Christianity believed in reincarnation... ;-)

Won't happen :)

But I recently asked our pastor whether there was good quality beer in
heaven. He said "I sure hope so!". Maybe they've got Internet, too.
Online is fantastic, I needed to do some year-end biz purchasing and
could use one more solder station because that darn digital one croaked
again. No more digital irons here from now on unless I get to design the
electronics myself. Just ordered a Weller WES-51, 92 bucks at Amazon.
Unbelievable. Oh, I also added a Samsung NC-10 netbook (thanks for the
hint, Howard!).
 

Joerg

krw said:
Joel Koltner wrote:
[...]
Even if the programmers exhibited the same level of diligence today as
they did in the 80's, code that is bloated 100-1000 times in size is
bound to have a huge increase in the number of bugs.

They'd be bankrupt.

Well, when people like me start looking and shopping for older versions
of Acrobat or Works or whatever, that would disturb me if I were a
manager there.

Take a look at Vista: IMHO a blatant slap in the face for the designers.
Clients of mine only buy PCs when they come with XP or the right to
downgrade. Until a few minutes ago (when I placed the order) I was in
the market for a netbook -> All XP. Else I wouldn't have bought one.
 

Joerg

Joel said:
I'll cross my fingers. :)


Cool, let us know how it works for you. Over Thanksgiving the only
netbook I found in a store to play with was the HP Mini 1000, and I was
surprised by how well the keyboard worked given the size. Still, I'm not
quite sold on them yet for my own needs. I really wish that the old HP 2133
wasn't the only machine with a 1280x768 pixel display... 1024x600 (as found on
all the other netbooks) just seems a little too limiting. Other than that,
the Dell Mini 9 looks quite attractive... what made you decide on the NC-10 in
particular?

I was eyeing the Dell Mini 9 at first, then some others. The Samsung won
hands down in battery runtime. I wanted a time machine transfer back
into the early 90's where my old Compaq Contura gave me 6h out of
old-tech NiCd. The Samsung clocked in at 7-8h in tests. That cinched it
for me.

With some battery saver tricks and hibernate mode I should be able to
use that at clients all day long without having to plug in. Of course,
for fire-ups of new (larger) designs I'll still schlepp the heavy
Durabook because its larger screen makes Gerber and schematic viewing
much easier. That Durabook has been very good to me over the years. It
still turns heads because it has that nice macho DoD look.

Also ordered a Prologix adapter at Sewell so I don't have to use the
digital camera to get a screen shot off of older HP equipment at
clients. Hopefully. We'll see how easy it is to get that local
button-press plot intended for the old HP Think-Jets into the laptop or
netbook. This would also save time.
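For anyone attempting the same capture, here is a minimal sketch in Python with pyserial of how grabbing a button-press plot through the adapter could look. Everything in it is an assumption rather than something from the thread: the Prologix GPIB-USB adapter enumerates as an ordinary serial port and takes `++`-prefixed control commands (`++mode`, `++addr`, `++lon`, per its manual), and the port name, GPIB address, and output file are placeholders.

```python
def prologix_listen_setup(addr):
    """Prologix command lines to sit on the GPIB bus as a listener:
    device mode plus listen-only, so whatever the instrument sends to
    its "plotter" gets captured (per the Prologix manual)."""
    return [
        b"++mode 0\r",          # device mode: the instrument does the talking
        b"++addr %d\r" % addr,  # our GPIB address (the plotter's address)
        b"++lon 1\r",           # listen-only: capture all bus traffic
    ]

def capture_plot(port="/dev/ttyUSB0", addr=5, outfile="screen.plt"):
    """Call this, then press the instrument's local plot button; bytes
    are saved to `outfile` until the bus goes quiet for 2 seconds."""
    import serial  # pyserial; imported here so the helper above needs nothing
    with serial.Serial(port, 115200, timeout=2) as link:
        for cmd in prologix_listen_setup(addr):
            link.write(cmd)
        with open(outfile, "wb") as f:
            while True:
                chunk = link.read(4096)
                if not chunk:   # read timed out: plot finished
                    break
                f.write(chunk)
```

The captured file should be the instrument's raw plotter output (HP-GL, or ThinkJet printer data, depending on the instrument's hardcopy setting), which a plotter emulator can then render.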
 

Joerg

Joel said:
Are you familiar with John Miles' GPIB tools here -->
http://www.thegleam.com/ke5fx/gpib/readme.htm ? For free they're quite
good -- I've used them for a number of years now (I can vouch that they work
with HP 875x network analyzers and 859x spectrum analyzers), and he added
support for the Prologix devices not too long ago now.

Yes, when the Prologix arrives I want to try out the 7470.exe routine.
 