Maker Pro

DIY CNC Machine How Hard Can It Be?

L

linnix

Fast CNC ..no...
I'll go for quick development over making a speedy CNC machine.

I'm guessing the motor controllers just need xy data from the PC
interface..
For example:
Motor X go to location AC0E 0097
Motor Y go to location 3498 EA33
Motor Z go to location 0000 0000

Somehow, I can't imagine this requiring a lot of stand-alone
processing..
D from BC

Well, motors are like cavemen. They are not so smart. For a stepper,
there are four coils, and you have to turn on A-C for n microseconds,
then B-D for another n microseconds. You have to keep doing that
until you reach the target. How you find the target is the difficult
part, usually machine vision or some other feedback.
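
For illustration, a minimal C sketch of that coil sequencing -- assuming
a four-coil stepper driven two-phases-on from the low nibble of an output
port (PORT_OUT, delay_us, and the table order are hypothetical; the real
pattern depends on how the windings are wired):

#include <stdint.h>

extern volatile uint8_t PORT_OUT;   /* hypothetical coil-drive port */
extern void delay_us(uint32_t us);  /* hypothetical busy-wait */

#define COIL_A 0x01
#define COIL_B 0x02
#define COIL_C 0x04
#define COIL_D 0x08

/* One common full-step, two-phase-on sequence: each entry energizes
   a pair of coils for n microseconds before moving on. */
static const uint8_t step_table[4] = {
    COIL_A | COIL_B,
    COIL_B | COIL_C,
    COIL_C | COIL_D,
    COIL_D | COIL_A,
};

/* Blindly walk toward a target step count -- open loop, no feedback,
   which is exactly the "not so smart" part. */
void step_toward(int32_t target, int32_t *position, uint8_t *phase)
{
    while (*position != target) {
        int dir = (target > *position) ? 1 : -1;
        *position += dir;
        *phase = (uint8_t)((*phase + dir) & 3);
        PORT_OUT = step_table[*phase];
        delay_us(1000);             /* the "n microseconds" per phase */
    }
}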
 
D

D from BC

Well, motors are like cavemen. They are not so smart. For a stepper,
there are four coils, and you have to turn on A-C for n microseconds,
then B-D for another n microseconds. You have to keep doing that
until you reach the target. How you find the target is the difficult
part, usually machine vision or some other feedback.

I haven't looked yet.. but I'm imagining some smart motor control ICs
might make things easier.
Also.. I'm certain I'll end up programming a uC too.
D from BC
 
D

D from BC

Well, suppose you're doing a G01 X1. Y1.2 from part zero,
you're going to have to be telling it an awful lot of stuff to get
that movement smooth.. then consider a G03! (CCW circular
interpolation). That stuff is standard on pretty much the oldest
1960s CNC machines.


Best regards,
Spehro Pefhany

I had to go on Wikipedia to look up G code.
I'm really a newbie at this.. :(
http://en.wikipedia.org/wiki/G-code

About XY positioning..
Yeah... It's probably a crazy idea to transfer every micron of XY position
from the PC to the motor controllers just to get smooth movement.
It'll be a lot of data and it'll have to move fast.
USB comes to mind.
It's to dodge making a motor controller that has to interpret G code.
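
(For a rough feel, with assumed numbers: at 1 micron resolution and a
10 mm/s feed, each axis needs 10,000 positions per second; three 32-bit
coordinates per point is 12 bytes, so about 120 KB/s. Full-speed USB at
12 Mbit/s covers that comfortably; a 115,200-baud serial link does not.)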

D from BC
 
S

Spehro Pefhany

Fast CNC ..no...
I'll go for quick development over making a speedy CNC machine.

I'm guessing the motor controllers just need xy data from the PC
interface..
For example:
Motor X go to location AC0E 0097
Motor Y go to location 3498 EA33
Motor Z go to location 0000 0000

Somehow, I can't imagine this requiring a lot of stand-alone
processing..
D from BC

Well, suppose you're doing a G01 X1. Y1.2 from part zero,
you're going to have to be telling it an awful lot of stuff to get
that movement smooth.. then consider a G03! (CCW circular
interpolation). That stuff is standard on pretty much the oldest
1960s CNC machines.


Best regards,
Spehro Pefhany
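
To make that concrete: even a single G01 expands into a long stream of
coordinated X/Y step pulses, along the lines of this Bresenham-style C
sketch (step_x/step_y are placeholder pulse routines; real firmware
would also pace the pulses to ramp velocity):

#include <stdint.h>
#include <stdlib.h>

extern void step_x(int dir);   /* placeholder: pulse X axis +/-1 step */
extern void step_y(int dir);   /* placeholder: pulse Y axis +/-1 step */

/* Walk from (0,0) to (dx,dy) one step at a time, keeping the path
   within half a step of the ideal straight line. */
void g01_line(int32_t dx, int32_t dy)
{
    int sx = (dx >= 0) ? 1 : -1;
    int sy = (dy >= 0) ? 1 : -1;
    int32_t ax = labs(dx), ay = labs(dy);
    int32_t err = ax - ay;
    int32_t x = 0, y = 0;

    while (x != dx || y != dy) {
        int32_t e2 = 2 * err;
        if (e2 > -ay) { err -= ay; x += sx; step_x(sx); }
        if (e2 <  ax) { err += ax; y += sy; step_y(sy); }
    }
}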
 
K

krw

It's not USB that's at fault; it's the OS and all the applications you
have running. Unlike most traditional busses (including PCI), USB has
its "isochronous" mode, where you're *guaranteed* a certain bandwidth
with bounded latency.


This is symptomatic of a flaky mouse, or possibly a flaky motherboard
or driver software, but says nothing about the USB protocol itself. I
remember a particular PCI VGA card I once had that was so poorly built
that in many systems it simply didn't work at all, and in others it had
highly visible "wavy lines" in the display... ewwww!


Again, this is really off-base. If you purchase a PCI interface IC such
as those made by PXI, you just use their device and vendor ID and then
supply your own sub-device and sub-vendor ID. I designed a PCI board
some years back around a PXI9054 IC (nice chip!) using nothing more than
their data sheets and the ~$50 Mindshare book on PCI -- no big $$$
amounts anywhere.

PLX9054? <http://www.plxtech.com/products/io/pci9054.asp>

I used that on a design a few years back too. Worked like a champ.
The documentation was a little confusing in a few places but not too
bad.

Mindshare has some very good books. I made good use of their PCI
book, even though my employer was a member of the "secret society",
so I knew the secret handshake to download the spec directly. The
Mindshare book was far easier to understand.
Oh please. It's far better that we *have* standards such as PCI and USB
and WiFi than everyone running off and creating their own incompatible
interfaces "for free" (which isn't free anyway!).

But it's *easier* to make incompatible products! ;-)
Ewwww... noooo!

Agreed. May it stay dead!
ISA doesn't exist anywhere in the standard consumer PC space anymore,
and I think it's even accurate to say that VME has held on better in the
industrial/embedded computing space than ISA!

Not hard. ISA simply sucks.
 
R

Robert Latest

Tim said:
Yes, but you can learn it. Find out the differences between stepper,
brushed DC, and brushless DC motors. I'd probably look at brushless DC
motors, but if you want to roll your own driver, life can get complicated.

I've had great success using the brushed DC drivers from geckodrive.com.
Cheap, compact, covering a large range of motors, and virtually
indestructible.

Not associated with them at all, just a satisfied customer.

robert
 
C

colin

D from BC said:
Yup... I'd just have the PC stream xyz data to the motor
controllers.
The motor control units will have to figure out what to do with the
data.
For a basic prototype, I'll make everything constant velocity. Or
perhaps a velocity based on the PC data rate or step size..
D from BC

I use http://www.st.com/stonline/books/ascii/docs/7616.htm
for my slow motors; they are quite robust, and I've only blown one,
after a fair bit of abuse. I also use TO-220 FETs driven by the usual
bootstrapped IR dual high/low MOSFET drivers.
It was a toss-up between that and the integrated 3-phase FET/controller
module, but if I blow one driver or MOSFET I don't have to replace the
whole thing.

I use a dsPIC33 to control everything; it also directly bitmap-drives a
1/4 VGA LCD panel, as it has enough RAM.
It PWM-controls the fast sine-wave BLDC motor and reads two optical
quadrature encoders.
I'm not sure PIC is the way to go, but I don't have experience of other
modern ones to compare. There are things I don't like about them, but I
often hear "well, the others are no better", as if that somehow makes it
all OK.
My early years were when you had piggy-backed EPROM versions for
prototyping before committing to ROM, which made microwave oven
controllers fun, as they were about the right length for a quarter-wave
microwave antenna.

You will probably have several coordinates that you will want to move
at the same time, so each point will probably need all the coordinates
you have available, and a time to get there in (or a velocity), as well
as spindle speed etc., or a laser on/off period.

The micro will have to work out all the points in between that
correspond to each change of state on all the stepper motors.
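
As a sketch of what one streamed point could carry under those
assumptions (field names here are illustrative, not any real protocol):

#include <stdint.h>

/* One motion segment streamed from the PC; the micro interpolates
   from the previous target to this one over duration_us. */
typedef struct {
    int32_t  x, y, z;        /* target coordinates, in steps */
    uint32_t duration_us;    /* time to get there (or send a velocity) */
    uint16_t spindle_rpm;    /* spindle speed for this segment */
    uint8_t  laser_on;       /* laser on/off (or a PWM period) */
} motion_segment_t;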

I started making something years ago which relied on dual lead screws
to do the drive and also guide each stage.
I'm not sure if this would have worked very well, as I never finished
it; I decided I was making it too big, after I had got into SM devices.

One idea I had was to use a disc drive with the work mounted on the
top platter, with a magnetostrictive sensor to sense the angle from a
bitfield impressed onto one of the platters (or opto slots etc.), and
the drill mounted on the head actuator.

I have an old 8" disc drive knocking about which probably gave me the
idea.

Each bit is quite simple, but the overall design is quite a lot of
bits. I've seen many CNC projects on the internet; it's surprising how
they are all quite different. The commercial ones all seem to be more
the same.

Colin =^.^=
 
M

MooseFET

I had to go on Wikipedia to look up G code.
I'm really a newbie at this.. :(
http://en.wikipedia.org/wiki/G-code

About XY positioning..
Yeah... It's probably a crazy idea to transfer every micron of XY position
from the PC to the motor controllers just to get smooth movement.
It'll be a lot of data and it'll have to move fast.
USB comes to mind.
It's to dodge making a motor controller that has to interpret G code.

You could go to some middle ground. The micro can be trusted to step
off the positions following a rule but the G code can be translated in
the PC to a simpler form.

It doesn't take much of a micro to be able to do this:

The interface from the PC sends 7 numbers for each motion that the
micro has to implement. The numbers are all 32-bit values. The position
numbers have the stepper/encoder value at the MSB end, i.e. there are
bits below what would actually cause a motion.

If I was doing it, I would send the numbers in base 16. Perhaps not
as hex but instead as the letters "A" through "P" since that makes the
decoding easier.
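
As an aside, that "A" through "P" scheme really does make the micro's
decoder trivial; a sketch (not MooseFET's actual code):

/* Each 4-bit value v (0..15) travels as a letter 'A'..'P', so decode
   is a single subtraction -- no digit-vs-letter special case as with
   ordinary hex. */
static unsigned char nibble_decode(char c)     { return (unsigned char)(c - 'A'); }
static char          nibble_encode(unsigned v) { return (char)('A' + (v & 0x0F)); }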

The input would look like this

LEAD STUFF X0 X1 X2 Y0 Y1 Y2 COUNT

The micro does this:

variables: StepperX, StepperY, StepChanged


while COUNT > 0
    COUNT = COUNT - 1
    StepChanged = False

    X1 = X1 + X2              ' accumulate second difference
    X0 = X0 + X1              ' accumulate first difference
    if MSB(X0) <> StepperX then
        StepperX = MSB(X0)    ' top bits crossed a step boundary
        Xport = StepperX
        StepChanged = True
    end if

    Y1 = Y1 + Y2
    Y0 = Y0 + Y1
    if MSB(Y0) <> StepperY then
        StepperY = MSB(Y0)
        Yport = StepperY
        StepChanged = True
    end if

    if StepChanged then
        Delay                 ' pace only when a motor actually moved
    end if
end while

This allows the micro to do second-order curves, sections of circles,
etc., but doesn't require much in the way of smarts in it.
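
To make the difference scheme concrete, here is one way the PC side
might load the terms for a plain constant-velocity line. The 16.16
fixed-point split and the function name are assumptions, not part of
MooseFET's description; a nonzero X2 would bend the path into a
parabola.

#include <stdint.h>

#define FRAC_BITS 16   /* assumed split: step count in the top bits */

/* Fill in X0/X1/X2 so that after count loop iterations the
   accumulator X0 has advanced by exactly (x_end - x_start) steps. */
void line_setup(int32_t x_start, int32_t x_end, uint32_t count,
                int32_t *X0, int32_t *X1, int32_t *X2)
{
    *X0 = x_start << FRAC_BITS;
    *X1 = (int32_t)((((int64_t)(x_end - x_start)) << FRAC_BITS)
                    / (int64_t)count);
    *X2 = 0;           /* zero second difference -> straight line */
}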
 
D

D from BC

I use http://www.st.com/stonline/books/ascii/docs/7616.htm
for my slow motors; they are quite robust, and I've only blown one,
after a fair bit of abuse. I also use TO-220 FETs driven by the usual
bootstrapped IR dual high/low MOSFET drivers.
It was a toss-up between that and the integrated 3-phase FET/controller
module, but if I blow one driver or MOSFET I don't have to replace the
whole thing.

I use a dsPIC33 to control everything; it also directly bitmap-drives a
1/4 VGA LCD panel, as it has enough RAM.
It PWM-controls the fast sine-wave BLDC motor and reads two optical
quadrature encoders.
I'm not sure PIC is the way to go, but I don't have experience of other
modern ones to compare. There are things I don't like about them, but I
often hear "well, the others are no better", as if that somehow makes it
all OK.
My early years were when you had piggy-backed EPROM versions for
prototyping before committing to ROM, which made microwave oven
controllers fun, as they were about the right length for a quarter-wave
microwave antenna.

You will probably have several coordinates that you will want to move
at the same time, so each point will probably need all the coordinates
you have available, and a time to get there in (or a velocity), as well
as spindle speed etc., or a laser on/off period.

The micro will have to work out all the points in between that
correspond to each change of state on all the stepper motors.

I started making something years ago which relied on dual lead screws
to do the drive and also guide each stage.
I'm not sure if this would have worked very well, as I never finished
it; I decided I was making it too big, after I had got into SM devices.

One idea I had was to use a disc drive with the work mounted on the
top platter, with a magnetostrictive sensor to sense the angle from a
bitfield impressed onto one of the platters (or opto slots etc.), and
the drill mounted on the head actuator.

I have an old 8" disc drive knocking about which probably gave me the
idea.

Each bit is quite simple, but the overall design is quite a lot of
bits. I've seen many CNC projects on the internet; it's surprising how
they are all quite different. The commercial ones all seem to be more
the same.

Colin =^.^=

Probably one of the things that makes DIY CNCs different from
commercial units is that "whatever-I-can-get" parts are used for DIY,
whereas commercial CNC machines probably get lots of nice custom
machined parts.
Also, for commercial machines... the electronics engineer may have
wanted to justify working longer (for more bucks) and so made the
design in the most bloated way possible :p
Example: using any off-the-shelf computer supply would have been fine,
but a roll-your-own was done.

D from BC
 
D

D from BC

You could go to some middle ground. The micro can be trusted to step
off the positions following a rule but the G code can be translated in
the PC to a simpler form.

It doesn't take much of a micro to be able to do this:

The interface from the PC sends 7 numbers for each motion that the
micro has to implement. The numbers are all 32-bit values. The position
numbers have the stepper/encoder value at the MSB end, i.e. there are
bits below what would actually cause a motion.

If I was doing it, I would send the numbers in base 16. Perhaps not
as hex but instead as the letters "A" through "P" since that makes the
decoding easier.

The input would look like this

LEAD STUFF X0 X1 X2 Y0 Y1 Y2 COUNT

The micro does this:

variables: StepperX, StepperY, StepChanged


while COUNT > 0
    COUNT = COUNT - 1
    StepChanged = False

    X1 = X1 + X2              ' accumulate second difference
    X0 = X0 + X1              ' accumulate first difference
    if MSB(X0) <> StepperX then
        StepperX = MSB(X0)    ' top bits crossed a step boundary
        Xport = StepperX
        StepChanged = True
    end if

    Y1 = Y1 + Y2
    Y0 = Y0 + Y1
    if MSB(Y0) <> StepperY then
        StepperY = MSB(Y0)
        Yport = StepperY
        StepChanged = True
    end if

    if StepChanged then
        Delay                 ' pace only when a motor actually moved
    end if
end while

This allows the micro to do second-order curves, sections of circles,
etc., but doesn't require much in the way of smarts in it.

Now there's something that'll keep me thinking for a while :)
D from BC
 
J

Joel Kolstad

krw said:

Yep, PLX, not PXI -- my mistake. (We have this National Instruments
rack in a screen room that proudly proclaims "PXI" in very large
letters... must be rotting my mind...)
I used that on a design a few years back too. Worked like a champ.

Agreed, it's a very nice chip. The chaining scatter-gather DMA was really
nice!
But it's *easier* to make incompatible products! ;-)

Yep, it sure is!

---Joel
 
J

Joel Kolstad

D from BC said:
Probably one of the things that makes DIY CNCs different from
commercial units is that "whatever-I-can-get" parts are used for DIY,
whereas commercial CNC machines probably get lots of nice custom
machined parts.

This is one of the reasons conversions of manual mills to CNC are so popular,
I think -- you already have the means to make custom parts, even if it is
slower initially!
Also, for commercial machines...the electronics engineer may have
wanted to justify working longer (for more bucks) and so made the
design in the most bloated way possible :p

You get a lot of that in bigger companies... there's a "DSP guy" and a
"microcontroller guy" and a "PC interface guy", and the end result uses
a lot more hardware than what's really necessary.
Example: using any off-the-shelf computer supply would have been fine,
but a roll-your-own was done.

Actually, a "computer" supply typically wouldn't have the higher-voltage
outputs (e.g., 24/28 VDC) that many motors would like. Still, there are
tons of off-the-shelf switchers available in pretty much any voltage you
want...

---Joel
 
F

Fred Abse

[quoted text muted]

Well, motors are like cavemen. They are not so smart. For a stepper,
there are four coils, and you have to turn on A-C for n microseconds,
then B-D for another n microseconds. You have to keep doing that
until you reach the target. How you find the target is the difficult
part, usually machine vision or some other feedback.

You don't want steppers for CNC. They put steps in the work. Slopes and
curves come out all rough.

Nobody "in the business" uses steppers.

DC is easiest and probably cheapest. 3-phase AC permanent magnet
synchronous is mostly current practice.
 
F

Fred Abse

I'm guessing the motor controllers just need xy data from the PC
interface..
For example:
Motor X go to location AC0E 0097
Motor Y go to location 3498 EA33
Motor Z go to location 0000 0000

You need to tell them how fast, too, unless all you want to do is drill
holes, when only the Z feedrate *really* matters.

If you want to mill slopes and curves, velocity is all-important.
 
L

linnix

Well, motors are like cavemen. They are not so smart. For a stepper,
there are four coils, and you have to turn on A-C for n microseconds,
then B-D for another n microseconds. You have to keep doing that
until you reach the target. How you find the target is the difficult
part, usually machine vision or some other feedback.

You don't want steppers for CNC. They put steps in the work. Slopes and
curves come out all rough.

Depends on what you want. In my case, all my holes are precisely
spaced and lines are straight. I don't need or want curves, not for
my current project.
Nobody "in the business" uses steppers.

I guess many precision tool makers are not "in the business". That's
why I can't buy the right tool. They keep telling me what I want is
not what I want.
 
L

linnix

And if you want the stepper to be able to go at more than a tiny
fraction of its speed capability, it will get up to a speed where a
momentary (say, a millisecond) interruption to the stepping signals will
result in it losing step. Basically, stepper motors cannot stop suddenly
without first ramping down to a low speed.

You can safely extend that to all motors.
 
C

Chris Jones

Jonathan said:
With stepper control, at least, my experience has been that there is a
controlled rate of speed increase (starting to move) and decrease
(slowing down towards the end of a move).

Jon


And if you want the stepper to be able to go at more than a tiny
fraction of its speed capability, it will get up to a speed where a
momentary (say, a millisecond) interruption to the stepping signals will
result in it losing step. Basically, stepper motors cannot stop suddenly
without first ramping down to a low speed. Either you need to always be
running at a painfully low speed, or you need a control computer that
does not go AWOL for a millisecond here and there.
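
A minimal sketch of such a ramp in C: the step period starts long,
shortens linearly to a cruise value, and stretches out again near the
end. The linear taper and all names are illustrative; a production
driver would compute a proper constant-acceleration profile.

#include <stdint.h>

extern void step_pulse(void);       /* placeholder: emit one step */
extern void delay_us(uint32_t us);  /* placeholder: busy-wait */

/* Trapezoidal-ish profile over total steps: accelerate for the first
   ramp steps, cruise, then decelerate over the last ramp steps.
   Assumes 0 < ramp <= total / 2 and slow_us >= fast_us. */
void move_ramped(uint32_t total, uint32_t ramp,
                 uint32_t slow_us, uint32_t fast_us)
{
    for (uint32_t i = 0; i < total; i++) {
        uint32_t period = fast_us;                  /* cruise */
        if (i < ramp)                               /* speeding up */
            period = slow_us - (slow_us - fast_us) * i / ramp;
        else if (total - i <= ramp)                 /* slowing down */
            period = slow_us - (slow_us - fast_us) * (total - i) / ramp;
        step_pulse();
        delay_us(period);
    }
}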

I remember seeing a PC-controlled circuit-board milling thing that used
stepper motors. Because it used some M$ operating system or other (can't
remember the exact vintage), sometimes the OS would find something more
important to do... Step X, Step Y, Step X, Step Y... see if the network
card is doing anything interesting... ooh, let's try the screen saver...
let's page some virtual memory to disk... where was I? ...Step X, Step
Y... The motors would lose step, and half of the tracks and holes on the
PCB would be in the right place, and the other half would be shifted off
to one side by a few millimetres, sometimes cutting through the first
tracks. Completely useless. You really need to be in full control of
whatever software is controlling the motors, even if you're only using
stepper motors. I haven't tried that "EMC" software (supposedly running
some real-time Linux variant), but the only other alternative I can see
would be to use a microcontroller for the time-critical stuff.

Chris
 
C

Chris Jones

Joel said:
It's not USB that's at fault; it's the OS and all the applications you
have running. Unlike most traditional busses (including PCI), USB has
its "isochronous" mode, where you're *guaranteed* a certain bandwidth
with bounded latency.

Whilst I expect you are right that the USB standard, if implemented
correctly, would work well, I think that many PCs don't implement the
standard very well, and peripherals are generally even worse. I guess
this has probably improved with time, as the other interfaces disappear
and a PC with broken USB becomes more and more useless, but some of my
first USB experiences have left a very negative impression on me.

This is symptomatic of a flaky mouse, or possibly a flaky motherboard
or driver software, but says nothing about the USB protocol itself. I
remember a particular PCI VGA card I once had that was so poorly built
that in many systems it simply didn't work at all, and in others it had
highly visible "wavy lines" in the display... ewwww!


Again, this is really off-base. If you purchase a PCI interface IC such
as those made by PXI, you just use their device and vendor ID and then
supply your own sub-device and sub-vendor ID. I designed a PCI board
some years back around a PXI9054 IC (nice chip!) using nothing more than
their data sheets and the ~$50 Mindshare book on PCI -- no big $$$
amounts anywhere.

Thank you for drawing my attention to that IC, and the book. I didn't know
about them.
Oh please. It's far better that we *have* standards such as PCI and USB
and WiFi than everyone running off and creating their own incompatible
interfaces "for free" (which isn't free anyway!).
Well, you are welcome to that opinion. I think that it *is* good that
PCI and USB and WiFi exist, but I think it was a BAD THING when the PCI
club decided to take the specification off their website and started
sending take-down notices to the ISPs of anyone who continued to make
copies of the standard available. To me that says pretty clearly "We
don't want you to use this standard for free", and since I want to do
things that can be done for free, that makes me not want to use their
standard. I know it won't hurt them measurably, but it gives me no
pleasure to work on equipment whose technical details have been
deliberately suppressed and kept from public knowledge. I'll work on
closed systems for money, but then it's just a job and nothing more.
Ewwww... noooo!

ISA doesn't exist anywhere in the standard consumer PC space anymore,
and I think it's even accurate to say that VME has held on better in the
industrial/embedded computing space than ISA!

---Joel
About 10 years ago, with a piece of edge connector hacksawed off a
broken video card, a bunch of HCT logic chips, a copy of Horowitz and
Hill, and a spare weekend, I made a 32-input data acquisition card that
enabled me to gather some data that I needed to gather. I think PCI
would have involved rather more stringent layout and construction
requirements, USB would have involved ordering and waiting for a fancy
IC and then a lot of messing around to get the low latency that I
needed, and VME was not available on my ex-dumpster PCs. Certainly ISA
is not elegant, but it got the job done.

I'll have a look at that PCI chip that you mentioned. A cheap (<$50)
general purpose digital I/O card would be of interest to me.

Chris
 
M

MooseFET

You need to tell them how fast, too, unless all you want to do is drill
holes, when only the Z feedrate *really* matters.

If you want to mill slopes and curves, velocity is all-important.

Hmmmm, that points out a problem with the method I suggested elsewhere:
the rate of motion can't be controlled by the PC, and it varies by about
40%. Let me see if I can fix that:

The PC needs to send the delay in a form that the micro can handle
without too much trouble.

When either stepper actually moves, we should delay by a PC-determined
amount. When both steppers step, we want to delay by 1.414 times the
normal amount.

To improve on this, when both steppers change, the delay could be
calculated like this:

Temp = Delay + (Delay SHR 2)
Temp = Temp + (Temp SHR 3)
Delay(Temp)

This would be good to 1/2%.
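
In C, that shift-and-add looks like the sketch below: (1 + 1/4)(1 + 1/8)
= 1.40625 against sqrt(2) = 1.41421, an error of about 0.56% -- and no
multiply needed.

#include <stdint.h>

/* Approximate d * sqrt(2) for the both-axes-stepped delay:
   t = d + d/4 = 1.25*d, then t + t/8 = 1.40625*d. */
static uint32_t diagonal_delay(uint32_t d)
{
    uint32_t t = d + (d >> 2);
    return t + (t >> 3);
}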
 