Maker Pro

Motor speed control via back-EMF detection


Phil Pemberton

Hi guys,
I've been tearing apart a Brother PT-1000 label printer with the eventual
goal being to connect it to a desktop PC to print labels for my various
component storage boxes. I've figured out how the print head communicates with
the controller board, which just leaves the motor drive circuitry.

Naturally, being a low cost device (£15), the PT-1000 doesn't use anything
remotely stepper-motor-like for the label feed. Instead, it uses a cheap
Mabuchi DC pancake motor (an RF-300C-11440, for which I have yet to find a
datasheet) and a speed reduction gearbox. The speed control is performed by a
ROHM BA6220 chip.

Ideally I'd like to eliminate the ROHM chip (seeing as it's basically
unobtainium) and use a PIC of some description (probably a 12F675) to do PWM
speed control of the motor. But first I'd like to understand how the ROHM chip
manages to do what it does.

I found a datasheet for the '6220 here:
<http://www.classiccmp.org/rtellason/chipdata/ba6220.pdf>
.. but like most ROHM datasheets, it doesn't say much about the chip, other
than that it uses back-EMF sensing and how to determine one of the two
external resistor values.

What I don't get is that there's no obvious way for the chip to sense
back-EMF. Everything I've been able to find about bEMF sensing suggests that
it's normally used with PWM control -- the motor is powered up for a short
period of time, then in the off period the voltage across the motor is sampled
and used to (roughly) determine the motor speed. Unless it's sensing current,
but if it is, the "application circuit" (BA6220 datasheet, Fig. 2, page 2)
doesn't look like any current sensing circuit I've ever seen. In fact, it
looks like a voltage comparator, but the polarity of the voltage reference
doesn't look quite right...

I've hooked the scope up to the driver IC's pins (and the motor itself) and
didn't see anything that suggested the driver IC was switching the power on
and off. In fact, the voltage remained more or less constant, excepting the
~50mV sine wave (plus one ~200mV spike per cycle) modulation that I suspect is
being caused by the motion of the commutator relative to the brushes.

Can anyone shed some light on this?

Like I said, I'm getting rid of this thing anyway, but I'd rather like to
understand how the existing circuit works first, if at all possible...

Thanks,
 

Jamie

Phil said:
Hi guys,
I've been tearing apart a Brother PT-1000 label printer with the
eventual
goal being to connect it to a desktop PC to print labels for my various
component storage boxes. I've figured out how the print head
communicates with
the controller board, which just leaves the motor drive circuitry.

Naturally, being a low cost device (£15), the PT-1000 doesn't use
anything
remotely stepper-motor-like for the label feed. Instead, it uses a cheap
Mabuchi DC pancake motor (an RF-300C-11440, for which I have yet to find a
datasheet) and a speed reduction gearbox. The speed control is performed
by a
ROHM BA6220 chip.

Ideally I'd like to eliminate the ROHM chip (seeing as it's basically
unobtainium) and use a PIC of some description (probably a 12F675) to do
PWM
speed control of the motor. But first I'd like to understand how the
ROHM chip
manages to do what it does.

I found a datasheet for the '6220 here:
<http://www.classiccmp.org/rtellason/chipdata/ba6220.pdf>
.. but like most ROHM datasheets, it doesn't say much about the chip,
other
than that it uses back-EMF sensing and how to determine one of the two
external resistor values.

What I don't get is that there's no obvious way for the chip to sense
back-EMF. Everything I've been able to find about bEMF sensing suggests
that
it's normally used with PWM control -- the motor is powered up for a short
period of time, then in the off period the voltage across the motor is
sampled
and used to (roughly) determine the motor speed. Unless it's sensing
current,
but if it is, the "application circuit" (BA6220 datasheet, Fig. 2, page 2)
doesn't look like any current sensing circuit I've ever seen. In fact, it
looks like a voltage comparator, but the polarity of the voltage reference
doesn't look quite right...

I've hooked the scope up to the driver IC's pins (and the motor
itself) and
didn't see anything that suggested the driver IC was switching the power on
and off. In fact, the voltage remained more or less constant, excepting the
~50mV sine wave (plus one ~200mV spike per cycle) modulation that I
suspect is
being caused by the motion of the commutator relative to the brushes.

Can anyone shed some light on this?

Like I said, I'm getting rid of this thing anyway, but I'd rather
like to
understand how the existing circuit works first, if at all possible...

Thanks,
It's called armature feedback.
DC motors generate a voltage just as DC generators do.
Keeping the DC voltage constant under variable loads does not mean
the motor will be perfectly stable; however, it'll
come close under normal loading conditions.

All the controller is doing is maintaining the voltage at the desired
set point. While the motor loads this IC, it supplies
voltage to the motor as needed (under loading conditions) and backs the
drive off when the motor unloads (shaft torque drops and the
DC motor is then regenerating, which adds to the voltage).


http://webpages.charter.net/jamie_5
 

Bob Eld

Phil Pemberton said:
What I don't get is that there's no obvious way for the chip to sense
back-EMF. [...]

The motor current is sensed inside the 6220 chip with a 20:1 current mirror.
Though not completely clear because actual components are left out of the
block diagram, there are two transistors being driven from the internal
amplifier. They are the main parts of the current mirror. The motor current
goes to the collector of one transistor and one twentieth of that current
flows in the other transistor. The current ratio is established by the
mirror geometry from the ratio of collector areas in the chip.

This sensed current, together with the motor terminal voltage, is applied to the
internal op-amp in a way that forms a back-EMF bridge. The bridge subtracts the
current-generated (I*R) drop from the motor terminal voltage, leaving only the
back EMF, which is proportional to speed. This can be shown with some simple algebra.

This bridge concept allows DC motor back EMF, and thus speed, to be sensed in
the steady, DC state.
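
For what it's worth, the algebra goes roughly like this (my notation, not ROHM's;
Ra is the armature resistance, Im the motor current, E the back EMF):

  Vm = E + Im*Ra        motor terminal voltage = back EMF + resistive drop
  Is = Im / 20          mirror-sensed copy of the motor current

If the external resistor is scaled so the sensed current develops a voltage of
20*Ra*Is = Im*Ra, the op-amp can form

  Vm - 20*Ra*Is = E

and servo on E alone -- that is, on speed -- with no switching and no tachometer.
The exact scaling is an assumption on my part; the datasheet only tells you how
to pick one of the external resistors for a given motor.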
 

nospam

Phil Pemberton said:
Like I said, I'm getting rid of this thing anyway, but I'd rather like to
understand how the existing circuit works first, if at all possible...

You put a constant voltage across a DC motor and it spins at a speed where the
back EMF matches the voltage. I suppose you could call that 'back-EMF'
detection. The problem is the additional 'back-EMF' from the armature
resistance, which depends on armature current and hence on motor load.

The voltage across the motor is tapped down by Rs and Rt and the opamp in
the chip servos that voltage to be equal to Vref. Additionally the 20:1
current mirror in the chip pulls 1/20th of the motor current through Rt
which causes the voltage across the motor to be increased by the amount
that current increases the voltage across Rt. If Rt is 20 times the motor
armature resistance it more or less matches the 'back-emf' from the
armature resistance.
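
Sketching that in symbols (my notation; the exact node arrangement inside the
chip is an assumption, but the idea is the one described above):

  Vm = Vset + (Im/20)*Rt      servo holds the divider at Vref, mirror adds a boost
  E  = Vm - Im*Ra             back EMF = terminal voltage minus armature drop
     = Vset + Im*(Rt/20 - Ra)

With Rt = 20*Ra the load-current term cancels, so the back EMF -- and hence the
speed -- stays at the value set by Vref and the divider, regardless of load.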

If you are pulling one apart presumably it has one of these chips so why
would you want to replace it?

--
 

Philip Pemberton

nospam said:
You put a constant voltage across a DC motor and it spins at a speed where the
back EMF matches the voltage. I suppose you could call that 'back-EMF'
detection. The problem is the additional 'back-EMF' from the armature
resistance, which depends on armature current and hence on motor load.

So in effect..
* More load on the motor shaft means more current drawn by the motor (the
energy has to come from somewhere).

* Ohm's Law states that because the current has increased, either the
voltage has increased or the resistance of the armature winding has decreased.

* Assumption: because the drive circuit is holding the voltage steady, the
resistance must have changed.

But if the current has increased and the voltage hasn't changed, won't the
increased current / reduced resistance balance things out?
The voltage across the motor is tapped down by Rs and Rt and the opamp in
the chip servos that voltage to be equal to Vref. Additionally the 20:1
current mirror in the chip pulls 1/20th of the motor current through Rt
which causes the voltage across the motor to be increased by the amount
that current increases the voltage across Rt. If Rt is 20 times the motor
armature resistance it more or less matches the 'back-emf' from the
armature resistance.

But if you're providing a constant voltage, then the current wouldn't matter,
would it?

I know the chip is acting as a current amp (first thing I learned about BJTs
is that they're current amplifiers -- Ic = Ib * Hfe in an ideal world). So is
all the opamp / Vref circuitry really just there to hold the motor voltage steady?

If that's the case, why not just use an LM317 or similar voltage regulator?
If you are pulling one apart presumably it has one of these chips so why
would you want to replace it?

Because I can't find the BA6220 in any of my suppliers' catalogues -- it seems
to be one of those odd parts that exists, but only in China and the
surrounding area...

I'm rebuilding the entire drive circuit for the printer mechanism so I can use
it to print labels from the PC; adding a motor speed controller at the same
time wouldn't take much longer. I can rig up a PWM controller with a PIC micro
and a MOSFET, then monitor the back-EMF with the A/D converter when the PWM
output is off.
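
Something along these lines would do it -- only a sketch: the three hardware
hooks (pwm_set_duty, adc_read_motor, delay_us) are placeholders for whatever the
real PIC/MOSFET layer ends up being, and the timings and gain are guesses to be
tuned on the bench.

/* Software speed loop: drive the motor with PWM, sample the back EMF while
 * the FET is off, and nudge the duty cycle to hold the back EMF (i.e. the
 * speed) at a setpoint. */

#include <stdint.h>

/* --- hypothetical hardware hooks (stubs so the sketch compiles) --- */
static void     pwm_set_duty(uint8_t duty)  { (void)duty; }  /* 0..255 */
static uint16_t adc_read_motor(void)        { return 0; }    /* 10-bit counts */
static void     delay_us(uint16_t us)       { (void)us; }

#define BEMF_TARGET  512   /* back-EMF setpoint in ADC counts (sets the speed) */
#define KP_NUM       1     /* proportional gain = KP_NUM / KP_DEN */
#define KP_DEN       8

int main(void)
{
    uint8_t duty = 128;                    /* start at roughly half drive */

    for (;;) {
        /* Drive phase: apply the current duty for one PWM period. */
        pwm_set_duty(duty);
        delay_us(900);

        /* Sense phase: FET off, let the inductive spike die away, then
         * sample the motor terminals.  With the drive removed the terminal
         * voltage is (roughly) the back EMF, i.e. proportional to speed. */
        pwm_set_duty(0);
        delay_us(50);                      /* flyback settling time (guess) */
        uint16_t bemf = adc_read_motor();
        delay_us(50);

        /* Proportional correction: speed too low -> more duty, and so on. */
        int16_t err  = (int16_t)BEMF_TARGET - (int16_t)bemf;
        int16_t next = (int16_t)duty + (err * KP_NUM) / KP_DEN;
        if (next < 0)   next = 0;
        if (next > 255) next = 255;
        duty = (uint8_t)next;
    }
}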

I think I need to get a couple of motors and a current shunt out of my junk
box and do some tests...

Thanks.
 

John Devereux

Philip Pemberton said:
So in effect..
* More load on the motor shaft means more current drawn by the motor (the
energy has to come from somewhere).

* Ohm's Law states that because the current has increased, either the
voltage has increased or the resistance of the armature winding has decreased.

* Assumption: because the drive circuit is holding the voltage steady, the
resistance must have changed.

But if the current has increased and the voltage hasn't changed, won't the
increased current / reduced resistance balance things out?

John Larkin already explained but here is how I think of it (for DC
drive).

Model the motor as a battery (back emf proportional to speed) in
series with a winding resistance (fixed). The current through this
circuit results in a proportional torque.
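
In symbols (the standard permanent-magnet DC motor model, my notation):

  V = E + I*Ra       applied voltage = back EMF + drop across winding resistance
  E = k*w            back EMF proportional to speed w
  T = k*I            torque proportional to current

so in steady state  w = V/k - T*Ra/k^2 : the speed is set by the applied voltage,
minus a droop proportional to the load torque and to the winding resistance.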

An ideal motor would have a winding resistance of zero. If you applied
a voltage it would accelerate with infinite torque, taking infinite
current, until the back emf (speed) equalled the applied
voltage. (High quality motors "nearly" do this; you get very high
torque and current demand for a step change in applied voltage).

The actual winding resistance means that some voltage is dropped
across it when the motor is taking more current (i.e. it is being
loaded up). So it slows down proportional to the load. This can be
fixed by designing a supply with *negative* resistance, such that it
increases its output voltage when the current drawn from it
increases. This cancels the effect of the winding resistance and in
the ideal results in a constant speed with load.

This circuit shows the principle:
<http://focus.tij.co.jp/jp/lit/an/sboa043/sboa043.pdf>.
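
Putting that in the same notation as above: if the supply has output resistance
Ro, the motor sees V = V0 - Ro*I, so

  w = V0/k - T*(Ra + Ro)/k^2

A plain regulator (Ro = 0) leaves the full Ra droop. Making Ro = -Ra cancels it,
and the speed sits at V0/k whatever the load -- within reason: overdo the
negative resistance and the net resistance goes negative, which is the matching
and stability tweaking mentioned further down the thread.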
 

nospam

Philip Pemberton said:
So in effect..
* More load on the motor shaft means more current drawn by the motor (the
energy has to come from somewhere).

* Ohm's Law states that because the current has increased, either the
voltage has increased or the resistance of the armature winding has decreased.

* Assumption: because the drive circuit is holding the voltage steady, the
resistance must have changed.

The motor is not a resistor, it is a voltage generator. It spins at a speed
where the voltage it generates matches the voltage you applied.

The motor also has some armature resistance, which is independent of speed.
When the load, and so the current, increases, the Ohm's-law voltage developed
across this fixed resistance is subtracted from the voltage you applied, and the
motor slows so the generated (back-EMF) voltage matches the remainder.

The circuit in the chip increases the applied voltage in proportion to the
motor current to compensate for the ohms law voltage generated across the
armature resistance. It is a negative resistance circuit where the negative
resistance cancels out the speed independent armature resistance.

--
 

Bob Eld

Philip Pemberton said:
[...]
I know the chip is acting as a current amp (first thing I learned about BJTs
is that they're current amplifiers -- Ic = Ib * Hfe in an ideal world). So is
all the opamp / Vref circuitry really just there to hold the motor voltage steady?

If that's the case, why not just use an LM317 or similar voltage regulator?
[...]

The motor on the 6220 chip is NOT running with constant voltage as if
supplied by a 317 or other regulator. It is running with constant back emf
and thus constant speed. The purpose of the 6220 is to sense this back emf
and separate it from the supply voltage.

Think of it this way: any DC motor, when spinning, acts as a generator and
creates a generated voltage. This voltage is the back EMF and is totally
separate and distinct from the forward voltage driving the motor. Therefore,
the voltage that appears on the motor windings has two components: the forward
drive component and the back EMF. The back EMF is the one proportional
to speed, but you can't get at it directly unless the motor is disconnected
from its source.

However, you can get at the back EMF with some algebra, basically by subtracting
the resistive part of the driving voltage from the total terminal voltage. The
6220 chip does this internally.

If you can't find a 6220 or equivalent chip, you can make your own back emf
bridge with several resistors and an op amp arranged to do this algebra and
extract the back emf.
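
One common arrangement -- a sketch, not necessarily what the 6220 does
internally -- is to put a small sense resistor Rsense in series with the motor
and let the op amp form a weighted difference:

  Vout = Vmotor - (Ra/Rsense)*Vsense
       = (E + Im*Ra) - (Ra/Rsense)*(Im*Rsense)
       = E

where Vmotor is the voltage across the motor itself and Vsense = Im*Rsense. Set
the resistor ratio to Ra/Rsense and the I*R term drops out, leaving a voltage
proportional to speed that can be fed back to the drive stage.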

If you are interested, I can go into more detail. montassocatyahoodotcom
 

Jamie

Tim said:
The only trick is making the negative output impedance match the
armature impedance of the motor. I'm sure that there's a bit of
tweaking involved in getting the figure right, and in compensating the
amplifier so it's stable when you hang a motor off of it, but it's been
done for years.

AFAIK this is how the capstan speed of all but the cheapest or most
expensive cassette recorders was controlled.
Oh, you mean they don't use centrifugal contact regulators any more? :))



http://webpages.charter.net/jamie_5
 

Jan Panteltje

Oh, you mean they don't use centrifugal contact regulators any more? :))

That is exactly what was in the motors.
Somebody here (...) did one of the latest Walkmans, and used a 3-phase motor
driven by some micro or other.
As I have one, I can claim that the centrifugal ones worked a lot better.
Hmm, maybe I put it in the garbage... mmm, lemme see... cannot find it...
mp3 solved that problem.
Strange, I never would have thrown it out without getting that little 3-phase motor out...
maybe the jangling sound pissed me off too much ;-)
Was it an Aiwa? Strange ffwd / reverse mechanics too.
 

Joerg

Tim said:
The only trick is making the negative output impedance match the
armature impedance of the motor. I'm sure that there's a bit of
tweaking involved in getting the figure right, and in compensating the
amplifier so it's stable when you hang a motor off of it, but it's been
done for years.

AFAIK this is how the capstan speed of all but the cheapest or most
expensive cassette recorders was controlled.

They even did that on record players. Couldn't believe it until I saw it.
 

Joerg

Jim said:
Not on my Rek-O-Cut (SP? after all these years :), it had a
synchronous motor and a tapered pulley so you could set the speed with
a strobe.

Ah, I remember the old strobe disk. Having to switch taper would somehow
hint that the line frequency wasn't all that stable out there. Mine only
had tapers for the various record speeds. Why a strobe disk came with it
I have no clue because if the speed was off you couldn't do anything
about it. Plus it was relying on line powered lights so how could it
ever show a deviation? It would be like trying to calibrate a frequency
counter with its own clock.
 

Joerg

Jim said:
Mine tilted the motor mount ever so slightly to change the BELT
position on the pulley to set the speed accurately.

I still have it. I'll get it down from the upper storage shelving and
take a picture.

But against what would it compare? The TV station's V-sync? I can't
imagine they provided a crystal controlled strobe lamp with it.
Synchronous motors have no slip. They are either in sync or stalled.
 

Joerg

Jim said:
Variable pulley ratio... you dig?

No grok here :)

Again:

Synchronous motor -> no slip -> always in sync with your utility's 60Hz.

Strobe disk -> lamp -> lamp also always in sync with utility.

So, where's the value in this measurement?
 

Joerg

Michael said:
Joerg, if the turntable was running the wrong speed, it needed
servicing. Running slow was usually bad or dried-out lubricant, and
running fast was the rubber breaking down and building up a layer of
crud on the motor shaft. The strobe disks came with a neon lamp, but in
the shop we just used the fluorescent lighting for the 120 flashes per
second. A decent turntable would have the strobe sit completely still,
after proper service was performed. GC Electronics sold the proper
lubricants and cleaning chemicals, as well as the strobe disks.

All I know is that the players I saw didn't slow down, they
stopped. That's because nearly all were driven by synchronous motors,
which do not slip but stall in overload. In Europe you just got the
disk, no strobe light. I still don't know what the disk was for. All it
did was confirm that the 50Hz going into the turntable was the same
50Hz going into the lamp. Which I kind of always knew :)
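
For reference, the arithmetic the strobe ring is built around (assuming the lamp
flashes at twice the mains frequency):

  marks on the ring = flashes per second / revolutions per second
  60 Hz mains: 120 / (33 1/3 / 60) = 216 marks for a 33 1/3 RPM ring
  50 Hz mains: 100 / (33 1/3 / 60) = 180 marks

The pattern stands still only if the platter turns exactly that fast relative to
the mains -- which is the point being made here: with a synchronous motor, both
sides of the comparison come from the same mains.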
 

Jan Panteltje

Then they'd have to toss out a whole part of the lecture at our university :)

Read the short paragraph about synchronous motors:
http://www.reliance.com/prodserv/motgen/b9652_s.htm

And these guys know, they build some huge motors.

Maybe, but maybe they never used them ;-)
From:
http://en.wikipedia.org/wiki/Synchronous_motor
<quote>
A synchronous electric motor is an AC motor distinguished by a rotor
spinning with coils passing magnets at the same rate as the alternating
current and resulting magnetic field which drives it. Another way of
saying this is that it has zero slip under usual operating conditions.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
<end quote>
I have underlined 'under usual operating conditions'.
When synchronous motors are used with varying loads, they may very well slip,
and if used in audio equipment that causes horrible modulation of the sound pitch.
An example is the little 3 phase synchronous motor in the walkman I have (or had?)
If the tape sticks a bit, or the cassette mechanism is sluggish, then a singer
sounds like a highly stressed freaking out person... (as it jumps phases).
For the rest it will just keep running, dropping phases.

So, 'either in sync, or stalled' is not correct, no matter what university wrote it.
No matter where in the universe, and no matter when.
 

Philip Pemberton

Tim said:
The only trick is making the negative output impedance match the
armature impedance of the motor. I'm sure that there's a bit of
tweaking involved in getting the figure right, and in compensating the
amplifier so it's stable when you hang a motor off of it, but it's been
done for years.

True.

With the aid of a cheap laser diode, a phototransistor and an oscilloscope,
I've measured the motor speed to be around 3000RPM (+/- about 100RPM as the
motor load changes). Now I just need to track down a MOSFET and a
microcontroller of some description, and rig up some form of speed controller.
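
(Assuming one optical pulse per revolution, 3000 RPM is 3000/60 = 50 pulses per
second, a 20 ms period on the scope; the +/- 100 RPM wobble then shows up as
roughly +/- 0.7 ms of period jitter.)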

Thanks for all your help guys, it was much appreciated.
 