Maker Pro

Larkin, Power BASIC cannot be THAT good:


Martin Brown

John said:
Not only did I not write Excel, I never use it. Spreadsheets are
idiotic toys.

I think you have missed the point of spreadsheets and other scratchpad
tools. They allow accountants and scientists to get results from modest
amounts of data without having to learn how to program in detail.

They are also excellent for creating test data using a method whose
characteristic modes of failure are completely different from those of
classical programming languages.
How about this:

http://www.highlandtechnology.com/DSS/V346DS.html

It does all the internals - lookup, DDS, interpolation, modulations,
summing, user-programmable microengines - at 128 MHz on 8 channels,
the equivalent of roughly 40G saturating math operations per second,
on a cheap Spartan FPGA. Well, we did epoxy a heatsink on top. Of
course, engineers can parallelize and pipeline, but programmers can't.

Your hardware is cute. But some of the statements you make about
software engineering are risible.
Why are you confusing the simple terms "linear search" and "sort"? How
very strange.

Most of the fast binary search methods rely on an ordered array of
target data. You have to pay for that sort at some stage.

But if you want a pure O(1) solution to replace a linear search then
hash tables can be extremely effective.
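For example, a minimal open-addressing table in C (the sizes, the toy hash
and the sentinel are arbitrary illustrative choices, not from any particular
program) turns the linear scan into an expected O(1) probe:

    /* Minimal open-addressing hash table: int key -> int value.
     * Illustrative only; the table must stay well under full, and the
     * EMPTY sentinel must never be used as a real key. */
    #include <stdio.h>
    #include <stdint.h>

    #define TABLE_SIZE 1024           /* power of two, so we can mask */
    #define EMPTY      INT32_MIN      /* sentinel for an unused slot  */

    static int32_t keys[TABLE_SIZE];
    static int32_t vals[TABLE_SIZE];

    static uint32_t hash(int32_t k)   /* cheap integer mix */
    {
        return ((uint32_t)k * 2654435761u) & (TABLE_SIZE - 1);
    }

    static void insert(int32_t key, int32_t val)
    {
        uint32_t i = hash(key);
        while (keys[i] != EMPTY && keys[i] != key)   /* linear probe */
            i = (i + 1) & (TABLE_SIZE - 1);
        keys[i] = key;
        vals[i] = val;
    }

    static int lookup(int32_t key, int32_t *val)     /* expected O(1) */
    {
        uint32_t i = hash(key);
        while (keys[i] != EMPTY) {
            if (keys[i] == key) { *val = vals[i]; return 1; }
            i = (i + 1) & (TABLE_SIZE - 1);
        }
        return 0;                                    /* not present */
    }

    int main(void)
    {
        int32_t v;
        for (int i = 0; i < TABLE_SIZE; i++) keys[i] = EMPTY;
        insert(42, 1000);
        if (lookup(42, &v)) printf("42 -> %d\n", v);
        return 0;
    }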
And what difference would it make, anyhow, whether an algorithm scales
or not if it doesn't need to scale?

If it doesn't need to scale then it makes no difference at all. But you
need to be *very* sure of that. Programs that originally run
inefficiently on small datasets tend to get used on bigger ones until
they grind to a standstill.

Regards,
Martin Brown
 

Martin Brown

Nobody said:
FP math is pretty much essential for (real) graphics.

No it isn't. A 32-bit fixed-point or scaled integer can represent and
specify a position with 1 micron resolution over a 4 km range.

The largest commonly used plotters are about A0 or a couple of metres
wide. You can do 1000ppi and 65.535" width in 16 bits though it tends to
be limiting.

It only gets really hairy if you want to zoom in dynamically. Floating
point for graphics is certainly *easier* to work with but it isn't
essential.
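As a sanity check on the numbers: 2^32 steps of 1 micron come to about
4295 m, i.e. roughly 4 km. A toy C sketch of the scaled-integer idea (the
unit and type choices here are illustrative only):

    /* Positions as 32-bit scaled integers: 1 count = 1 micron.
     * Unsigned range: 2^32 um ~= 4294.967 m, i.e. about 4.3 km. */
    #include <stdio.h>
    #include <stdint.h>

    typedef uint32_t pos_um;                 /* position in microns */

    static pos_um mm_to_pos(double mm)       /* convert only at the edges */
    {
        return (pos_um)(mm * 1000.0 + 0.5);
    }

    int main(void)
    {
        pos_um a = mm_to_pos(123.456);       /* 123456 um */
        pos_um b = mm_to_pos(0.001);         /* one 1 um step */
        printf("a + b = %u um\n", a + b);    /* pure integer arithmetic */
        printf("full scale = %.3f m\n", (double)UINT32_MAX / 1e6);
        return 0;
    }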
Integer-based graphics APIs have (or had) a place in low-resolution video
displays where you can see individual pixels, and where the graphics are
dominated by unscaled bitmaps and (mostly orthogonal) straight lines.

Integer coordinates don't make sense if you're operating at 300+ DPI, in
physical dimensions (mm or points rather than pixels), with scaled images,
thick (>1 pixel) lines, Bezier curves, arbitrary transformations, etc.

Integer or fixed point implementations are one of the ways it is done -
especially on smaller portable systems with limited grunt like phones.

Regards,
Martin Brown
 

Nobody

Yes. Most programming, as a process, is broken.

Only if you look at it from a technical or academic perspective. From a
business perspective, the process is quite well tuned.

Like anything, performance and reliability are worth whatever the market
is willing to pay for them, no more and no less. If the market prefers
cheap crap over paying for quality, you get cheap crap.
 

Anssi Saari

Jan Panteltje said:
How fast do you have to go anyways?
The latest H264 encoders use Nvidia graphics cards as hardware accelerators.
That is so specialised that, if you design that sort of software, then yes,
maybe you need to be a wizard in x86 asm:
http://www.mediacoderhq.com/
Do they use asm? Perhaps?

I don't know about them, but to my knowledge ffmpeg and mplayer use a
lot of asm.

As far as I know, there isn't much automatic vectorization in C
compilers either, so any non-trivial use of SSE and such needs
hand-coded assembly.
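For what it's worth, full assembly isn't the only route short of relying on
auto-vectorization: most C compilers expose SSE through intrinsics. A minimal
sketch (gcc/clang style, <xmmintrin.h>; the array length is assumed to be a
multiple of 4 to keep it short):

    /* Sum two float arrays four lanes at a time with SSE intrinsics. */
    #include <xmmintrin.h>

    void add_arrays(float *dst, const float *a, const float *b, int n)
    {
        for (int i = 0; i < n; i += 4) {
            __m128 va = _mm_loadu_ps(a + i);   /* unaligned 4-float load */
            __m128 vb = _mm_loadu_ps(b + i);
            _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
        }
    }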

OTOH, I think there was a fairly recent "survey" on slashdot about
assembly. The result was that few slashdot readers use it, but then I
read it as "slashdot-reading Perl and PHP coders don't use assembly".
Kind of a no-brainer...
 

Nobody

But in my experience, the worst software is the most expensive.

Niche software is generally both low quality and expensive; the
alternative would often be reasonable quality and unaffordable.
A lot of the best software is free.

True, but that's not created on a commercial basis.

And a lot of the worst software is also free.

Free software is liberated from the need to turn a profit, so there's no
product sabotage to preserve sales of the enhanced version, or chasing
meaningless bullet points at the expense of useful functionality.

OTOH, it's also liberated from the need to cater to what users actually
want (as opposed to e.g. what the developer thinks they ought to want).
 

Nobody

So I looked up that chapter.
But you know, these days I write in C and use gcc as my compiler.
I have come to trust gcc, its code generation, and its libraries.
So I could not care less whether the targeted processor (I use other
platforms than x86 too) has a hardware instruction to do everything my
program needs, as long as the compiler knows about it :)

Whether the compiler can make use of it may depend upon your algorithm.

E.g. one of Quake's key optimisations was to perform only one
perspective division for every 16 pixels. This is much faster than
performing a division per pixel, but visually almost as good; you won't
see the difference unless you're looking for it.

Why 16 pixels? Because that's how many pixels it can render using the
integer ALU while the FPU is (concurrently) performing the division.

You could write in C and still get the optimisation (with a sufficiently
good compiler), but you would need a reasonable understanding of the x86
in order to structure the code in such a way as to enable the optimisation.

The compiler won't perform approximations for you, and you can't test
every possible approximation for performance.
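A rough sketch in C of the structure being described - not Quake's actual
code - where the expensive 1/z division happens only at 16-pixel boundaries
and the pixels in between are stepped linearly:

    /* Perspective-correct texture stepping, one perspective divide per
     * 16-pixel span.  Illustrative only: u/z, v/z and 1/z are assumed to
     * vary linearly across the span, which is what makes the trick work. */
    #define SPAN 16

    void draw_span(float uoz, float voz, float ooz,    /* u/z, v/z, 1/z at start */
                   float duoz, float dvoz, float dooz, /* per-pixel gradients    */
                   int npix, void (*put)(float u, float v))
    {
        float z0 = 1.0f / ooz;                /* divide for the span start */
        float u0 = uoz * z0, v0 = voz * z0;

        for (int x = 0; x < npix; x += SPAN) {
            int   len  = (npix - x < SPAN) ? npix - x : SPAN;
            float uoz1 = uoz + duoz * len;
            float voz1 = voz + dvoz * len;
            float ooz1 = ooz + dooz * len;

            float z1 = 1.0f / ooz1;           /* the one perspective divide
                                                 per 16 pixels; on a Pentium it
                                                 overlapped the integer work */
            float u1 = uoz1 * z1, v1 = voz1 * z1;

            /* affine step between the two correct endpoints; dividing by
             * the constant 16 is cheap next to the perspective divide */
            float du = (u1 - u0) / len, dv = (v1 - v0) / len;
            for (int i = 0; i < len; i++)
                put(u0 + du * i, v0 + dv * i);

            uoz = uoz1; voz = voz1; ooz = ooz1;
            u0 = u1; v0 = v1;
        }
    }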
 

Tim Williams

Jan Panteltje said:
It can be fun programming in asm to get max speed - I have done video on a
PIC in asm, for example - but if you have enough processing power, why bother?

Video? In ASM? Hah, just the other day I tried getting my breadboarded Z80
to produce NTSC composite. Just bit banging out of a spare data bit or two.

No good, one instruction takes about an inch of scan, no time even to read
the next line, and the horizontal sync is terribly unreliable due to the
variable interrupt latency. With lots of prodding I made a few stable
pictures, nothing interactive. Definitely needs support hardware. An AVR
at 16MHz would probably do nicely though.

Tim
 

Rich Grise

My hardware and my software work. Both are reliable, documented,
reproducible, formerly controlled, bug-free, and profitable.
^^^^^^^
formally

Good one, huh?

;-)

Cheers!
Rich
 

krw

Such as being impossible to debug.

Spreadsheets are debuggable, at least as much as any interpreted
language.
I'm with John on spreadsheets (though not on BASIC)--they're famous for
generating reasonable-looking wrong answers, and (unlike programming
languages) their testing facilities are zilch. Sort of like the old
Royal-McBee drum memory computers, where every instruction contained the
address of the next instruction--every line had a GOTO.

I'm certainly not. I use them all the time to debug everything from
BOMs to database errors. They also make great tables and simple
charts-n-graphs.
Brr. Give me Mathcad or Octave for numbers, and little scripts for
other sorts of data. Spreadsheets--blech.

They're perfectly acceptable tools, with the understanding that they
are just tools.
 

krw

Free software is liberated from the need to turn a profit, so there's no
product sabotage to preserve sales of the enhanced version, or chasing
meaningless bullet points at the expense of useful functionality.

OTOH, it's also liberated from the need to cater to what users actually
want (as opposed to e.g. what the developer thinks they ought to want).

You imply that the "meaningless bullet points" are somehow different.
 

krw

One of my biggish customers, in one annual report, notes that "we have
almost recovered from the Oracle conversion." Occasionally I get their
ECOs and they make absolutely no sense; they don't seem to understand
them either.

AIUI, our part number and ECO tracking software company has been
bought out by Oracle so some day we'll have to migrate over. It's a
mess now so I can only imagine what an adventure of Titanic
proportions that'll be.
 

Nobody

You are making the common argument that better software has to cost
more. What I see is that some people write good code and some don't.

Not that it *has* to, just that it often does.

Once upon a time, IBM estimated that the computer market amounted to
around 20 computers in the entire world. Nowadays, everyone has one, and
the software industry is so large that there simply aren't enough "good"
programmers to go around. In this environment, some degree of mediocrity
is inevitable.
We use PADS for pcb layout; it has no known-to-me bugs, it's fast, and
it never crashes. LT Spice is great. Agilent's Appcad ditto. Irfanview
and Crimson are great. Firefox and Thunderbird too. Most Microsoft and
Adobe products are slow and buggy.

That's because Microsoft and Adobe are trying to corner the market, which
means that their products have to do absolutely everything that any
similar product might possibly want to do, otherwise they leave enough of
a market for a competitor to survive. This pretty much guarantees
bloatware.

Free software (or even less ambitious commercial software) will tend
to stick to what it's good at; or at least not bother trying to do what
it's particularly unsuited for.
 

James Arthur

John said:
But in my experience, the worst software is the most expensive.

Writing bad software is expensive. Bad hardware, too.

James Arthur
 

Nobody

Video? In ASM? Hah, just the other day I tried getting my breadboarded Z80
to produce NTSC composite. Just bit banging out of a spare data bit or two.

No good, one instruction takes about an inch of scan, no time even to read
the next line, and the horizontal sync is terribly unreliable due to the
variable interrupt latency. With lots of prodding I made a few stable
pictures, nothing interactive. Definitely needs support hardware. An AVR
at 16MHz would probably do nicely though.

You need a shift register, so that it's one iteration per 8 pixels
rather than one per pixel. And you still only get 32 columns with a
3.25MHz Z-80 (that's the spec. of the ZX80/81, whose video hardware was an
8-bit shift register, with everything else in software).

If you want "pixels" (as opposed to character-sized blocks) on a PIC, you
would realistically need to use a 20MHz clock (Fosc/4 = 5MHz), and use
either the USART or SSP as a shift register.

Otherwise, you're stuck with using a whole 8-bit port so that you can
shift out the bits with consecutive "rrf PORTC" instructions. And you get
gaps due to the instruction required to load the next byte.
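Roughly, the structure the shift register buys you looks like this (a
hypothetical C sketch: SHIFT_REG and SHIFT_BUSY stand in for the real
USART/SSP registers, which real PIC code would poke directly, likely in asm):

    /* Inner loop of one scan line: the hardware clocks out 8 pixels per
     * byte while the CPU fetches the next byte, so it's one iteration
     * per 8 pixels rather than one per pixel.  The register addresses
     * below are made up for illustration. */
    #include <stdint.h>

    #define SHIFT_REG  (*(volatile uint8_t *)0x40001000u)  /* hypothetical */
    #define SHIFT_BUSY (*(volatile uint8_t *)0x40001004u)  /* hypothetical */

    void send_scanline(const uint8_t *line, int nbytes)
    {
        for (int i = 0; i < nbytes; i++) {
            while (SHIFT_BUSY)     /* wait for the previous byte to clock out */
                ;
            SHIFT_REG = line[i];   /* hardware shifts out the next 8 pixels */
        }
    }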
 

krw

In a normal interpreted language, each statement follows the one ahead
of it, unless there's an explicit control structure. (Or a named label,
but we're assuming reasonable practice here--use GOTOs to escape from a
deeply nested loop to a nearby point, say, but not otherwise.)

Intermediate cells can be used for debug. Cells/sheets are named.
Any spreadsheet cell, on the other hand, can depend on any other
spreadsheet cell, without warning and without any way of finding that
out in general other than an exhaustive examination of each cell.

Not true. Excel, at least, shows the come-froms.
_Your_ spreadsheets, I'm quite sure, are sensibly structured and
reasonably comprehensible in outline. But that is far from the general
case, even for spreadsheets that have real money depending on them. A
nontrivial spreadsheet that has had more than a couple of people working
on it is a disaster to debug--far worse than spaghetti code.

You haven't seen spaghetti code. ;-) Try running a fab on a pile of
APL routines, like the one just north of you was, twentyish years ago.
Yep, tables and graphs are about it.

Checking tables from other databases, too.
Well, they're widely accepted, all right, but they still stink.

They're wonderful productivity tools. Just like a hammer, they're
dangerous in untrained hands.
 

krw

We did our own materials control software, and track ECOs essentially
manually. But we're small... about 5000 different parts, 2e6 pieces,
and 660 ECOs in the last 20 years.

We're pretty small too. 100 employees, mostly on the business side
(sales, marketing, BS), and perhaps 20 or 30 in production. Only seven
engineers (two hardware, one mechanical, one layout, and three
firmware).

Parts? Never counted them, but it's likely not a lot more than that.
Our latest ECO number was something like just over 1000 (a RoHS part
substitution on a non-RoHS product - just signed off on it). There is
no way the owner would allow the business to be run on a cobbled-together
program. I'm not sure what we have is any better, though. In theory it
does all the production stuff (BOM control, deviations, etc.). It doesn't
link to the purchasing database, and that's always an amazement (why on
Earth did we buy 15000 audio transformers or 6000 wall warts that we may
never use?).
 

krw

Our system isn't cobbled together. We designed it to do exactly what
we want. It's a joy to drive. I can search for a part by keywords, see
the search results, pop into the full inventory list with starting
point from any of the search results. I can select a part and see all
the assemblies it's used on. I can select a parts list, see total
parts cost, highlight any one part, and pop back into the inventory or
where-used bit. Every part can, and usually does, have an attached folder
with datasheets, photos, engineering notes - anything useful we want to
know.

Our engineering management tools do everything above. The search
isn't linear, though. ;-)
The part numbers are very, very logical and organized, as are the
description fields.

And it's blindingly fast.

ftp://jjlarkin.lmi.net/MAX.jpg


I'm not sure what we have is any better though.

That's a separate problem. Somebody probably observed 1) we'll sell
tons of these and 2) look how good the 10K price is!

Some of that, but mostly an overly aggressive development schedule. We
can't make enough widgets now, so I don't know how they ever expected
to sell enough to use the inventory in a reasonable time. Parts
sitting on the shelf aren't free, either.
One search param we have is called BigBux: type a dollar value, and it
shows all the parts that we have that many dollars or more of. This
invokes the occasional WTF?! reaction. It also catches the situations
where somebody typoed the price of a reel of resistors as $12 instead
of $0.012 each.

;-) There is a field in the database for projected cost, but it's not
linked to the purchasing system so it's rarely filled in. Kinda makes
*engineering* difficult.

I just demanded that they order a reel of AD8566s. For the project I
only need about 1K parts but the cut tape price (DigiKey) was over 3X
the full reel price (Arrow). ...so I designed out another part since
the 8566s were free. ;-)
 

Nobody

I think not. Browse the manual a bit. You may learn something.

I don't need a recap; I still maintain a couple of packages which print by
feeding PostScript to lpr.

No-one uses RPN because it's user-friendly (it isn't), but because it's
simple to implement. It's no surprise that RPN never took off outside of
contexts where minimalism was a priority (calculators, printers, OpenBOOT,
embedded systems generally).

RPN doesn't need a parser, only a tokeniser, as the format is simply a
linear sequence of tokens. While [...] (array) and <<...>> (dictionary)
may look like nesting constructs, they aren't. "[" and "<<" are operators
which push a mark onto the operand stack. "]" and ">>" are operators which
pull everything from the stack until the first mark. PostScript will
happily accept any of "[1 2 3]", "<<1 2 3]" and "mark 1 2 3]" to define an
array.

{...} (executable array) *is* a nesting construct. However, as it's the
only one, implementing it is a matter of a single variable to track
the current nesting level; "{" increments, "}" decrements. If the nesting
level is non-zero, execution is suppressed. No recursion or parse stack
required.
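A caricature of that scheme in C (execute() and defer() are hypothetical
hooks for "run this token now" versus "append it to the procedure being
built"; real interpreters also have to handle strings, comments and the
mark-based constructs above):

    /* Token scanner with a single depth counter for {...}: no parse
     * stack, no recursion.  Tokens are just whitespace-separated words. */
    #include <stdio.h>
    #include <string.h>

    static void execute(const char *tok) { printf("exec  %s\n", tok); }
    static void defer  (const char *tok) { printf("defer %s\n", tok); }

    void scan(char *program)
    {
        int depth = 0;                          /* {...} nesting level */
        for (char *tok = strtok(program, " \n");
             tok != NULL;
             tok = strtok(NULL, " \n")) {
            if (strcmp(tok, "{") == 0) {
                if (depth++ > 0) defer(tok);    /* inner braces are just data */
            } else if (strcmp(tok, "}") == 0) {
                if (--depth > 0) defer(tok);    /* depth 0: procedure complete */
            } else if (depth > 0) {
                defer(tok);                     /* building an executable array */
            } else {
                execute(tok);                   /* immediate execution */
            }
        }
    }

    int main(void)
    {
        char prog[] = "1 2 add { 3 4 mul } 5";
        scan(prog);
        return 0;
    }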
 

Martin Brown

Selective memory. There is good expensive software too. It tends to
exist in niche markets where high development costs have to be spread
across a small number of units.
Yes. And management should be able to force software quality, but
usually can't.

Big company management often pay only lip service to software quality -
it is all about maximising their bonuses which usually means minimum
time to market and maximum immediate sales. Turnover in the sales staff
at software development companies means they are seldom around when the
impossible promises they made to the customer have to be delivered.

Management know full well when the first version ships that it is
riddled with bugs. XL2007 is a good recent example. You can always issue
updates that ameliorate the problems later. But it's the *new* version
that counts: once customers have it, you can persuade other people to
upgrade by making the new file format incompatible with previous versions.

I think the cumulative updates to XL2007 (original boxed version) now
top 1 GB. After all, everyone that matters has broadband these days.

The big difference now is that nothing physically has to ship.

Regards,
Martin Brown
 