Maker Pro

Larkin, Power BASIC cannot be THAT good:


Martin Brown

John said:
I don't think the culture, in academia or in business, plans for
quality. The big issue that Computer Science should be addressing, the
issue where they could actually affect society in a meaningful way, is
programming quality.

They have, but the young new programmers don't often get the chance to
practice what they have been taught when they go into industry.

It's my opinion that high-quality software is on average cheaper to
make than buggy software.

That has been known for decades: IBM's 1981 analysis of formal reviews
as a method of early detection of specification errors highlighted it
early on, and a popularised account appears in "The Mythical Man-Month".
Formal reviews saved them a factor of 3 in bug-fixing costs and reduced
the defects found after shipping by a factor of 4. It still isn't
standard practice except in the best places.

In the UK the problem with every large government computer project stems
from the fact that the managers are innumerate civil servants. The new
ID card thing should be hilarious. One database to bind them all...

The real difficulty is in persuading customers that they do not want a
quick hack to give them something ASAP. Most times that is exactly what
they ask for even when they don't yet know what they want it to do. It
is the job of the salesman to get the order in ASAP so guess what happens...

Add in a few of the political factors like the guys who can see that if
the computer system works right they will be out of a job and you can
see how the specifications end up corrupt at the outset.

If you tried to build hardware with the sort of "specifications" some
major software projects start out with you would have a rat's nest of
components miles high by the time it was delivered.
Consider an X-Y plot: X-axis is programmer experience; Y-axis is bug
density. The engineering units are admittedly fuzzy, but go with the
concept.

So far so good. The problem is that the variation of the position of
these lines on the graph for different individuals is more than an order
of magnitude. That is, the worst programmers have a defect curve that is
more than 10x higher than the best practitioners. And there are not
enough good or excellent programmers. You cannot change the availability
of highly intelligent and trained staff so you have to make the tools
better.

BTW don't blame the programmers for everything - a lot of the problems
in the modern software industry stem from late injected feature creep.
I think the popular languages and culture tend to make the curve droop
down from the initial start, but then flatten out or even start to curve
back up. More experienced programmers go faster and use trickier
constructs and the newest tools to keep their bug rate up.

The newest tools like static testing and complexity analysis are all
about detecting as many common human errors at compile time as possible.
It is telling that the toolset for this is *only* available in the most
expensive corporate version of MS development tools.

When they should be in the version sold to students! A problem with
student projects is that they are small enough that any half-decent
candidate can rattle them off with or without the right approach.
Now consider plotting graphs in different colors for different
languages: Fortran, Cobol, C, C++, Ada, Perl, Python, Java. Are we
making progress?

If you want a safe well-behaved curve then something like one of Wirth's
languages, Pascal or Modula-2, is about as good as it gets (for that
generation of language). Java is pretty good for safety if a bit slow
and Perl has its uses if you like powerful dense cryptic code. APL is
even terser and has seen active service in surprising places.

Domain specific languages or second generation languages augmented by
well tested libraries can be way more productive.

Mathematica is one example.

Regards,
Martin Brown
 

Jasen Betts

On Jun 9, 2:18 pm, "Joel Koltner" <[email protected]> wrote:
How about for Microsoft SQL Server?

I'd looove to migrate my SQL Server code in VBA to C. Any ideas?

Can VBA call DLLs? C can be used to write DLLs.

what sort of things are you doing in VBA that you want to do in C?

It's fairly easy to do stuff to PostgreSQL in C; C is one of the many
languages available for writing stored procedures. Bad C can shoot your
database in the foot.
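
As a hedged sketch of the DLL route (the function name and signature are
invented for illustration, not taken from the thread): a minimal C
function exported from a Windows DLL, which VBA can then bind with a
Declare statement.

/* mathutil.c -- build as a DLL, e.g.  cl /LD mathutil.c  */
#include <windows.h>

/* __stdcall matches the calling convention VBA expects on Win32. */
__declspec(dllexport) long __stdcall AddLongs(long a, long b)
{
    return a + b;
}

On the VBA side this would be declared roughly as
Declare Function AddLongs Lib "mathutil.dll" (ByVal a As Long, ByVal b As Long) As Long
(add PtrSafe after Declare for 64-bit Office).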
 

Jasen Betts

Video? In ASM? Hah, just the other day I tried getting my breadboarded Z80
to produce NTSC composite. Just bit banging out of a spare data bit or two.

No good, one instruction takes about an inch of scan, no time even to read
the next line, and the horizontal sync is terribly unreliable due to the
variable interrupt latency. With lots of prodding I made a few stable
pictures, nothing interactive. Definitely needs support hardware. An AVR
at 16MHz would probably do nicely though.

The scan rate is about 16 kHz, so about 1000 clocks per line at 16 MHz;
if you use the USART or SPI for output you could probably get VGA
resolution. :)
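
A minimal sketch of the SPI-as-pixel-shifter idea, assuming an ATmega328
at 16 MHz (the register names are that part's; sync generation and line
timing are omitted). Each byte written to SPDR shifts out MSB-first at
fosc/2, i.e. eight pixels per write.

#include <stdint.h>
#include <avr/io.h>

static void video_spi_init(void)
{
    DDRB |= _BV(PB3) | _BV(PB5);   /* MOSI (pixel out) and SCK as outputs */
    SPCR = _BV(SPE) | _BV(MSTR);   /* enable SPI, master mode */
    SPSR = _BV(SPI2X);             /* SCK = fosc/2 = 8 Mbit/s pixel clock */
}

static void send_scanline(const uint8_t *line, uint8_t nbytes)
{
    for (uint8_t i = 0; i < nbytes; i++) {
        SPDR = line[i];                 /* load the next 8 pixels */
        while (!(SPSR & _BV(SPIF)))     /* wait for the shift-out */
            ;
    }
}

In practice the USART in its SPI master mode is the nicer peripheral for
this, because its transmitter is double-buffered and avoids the
inter-byte gap the busy-wait above leaves on screen.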
 

Tim Williams

Jan Panteltje said:
It sucked for me because it did not match my imagination. TV in low
resolution leaves things to the imagination; this is how the Hollywood
wild-west cities are built: just the front. It looks real, as the
imagination fills in what is not there.

I feel the same way about old games. Doom is the greatest. Who needs
accelerated video when you've got 320 x 200 x 256?

Tim
 

Nobody

If you want a safe well-behaved curve then something like one of Wirth's
languages, Pascal or Modula-2, is about as good as it gets (for that
generation of language). Java is pretty good for safety if a bit slow
and Perl has its uses if you like powerful dense cryptic code. APL is
even terser and has seen active service in surprising places.

Perl is the last language you should be using if you want robustness. It
actively encourages writing code which works 99% of the time; getting that
last 1% requires a tenfold increase in effort.

Any task which involves reading structured text should begin with a
robust parser which parses everything, not just the outermost layer. The
rest of the program operates on data structured as lists, tuples,
structures, objects, dictionaries, etc.

Manipulating data as structured text using string manipulation functions
is the road to hell. I'm convinced this approach has created more security
flaws than buffer overflows ever have.

Sure, you *can* write correct code in Perl; it just makes doing a
half-arsed job so much easier than doing it right.
 

Rich Grise

Funny. I would use the 86400, and put the 24*60*60 in a comment.

Then again, I don't need to write high-performance code.
I hope not! I define the constant, then use the label. That way,
if it changes, you don't have to track down every instance to
change the hard numbers.
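
In C that habit costs nothing at runtime; a hedged one-liner (the name
is invented for illustration):

/* Written as its derivation so the intent is self-documenting;
   the compiler folds it to 86400 at compile time. */
#define SECONDS_PER_DAY (24 * 60 * 60)

long timeout_s = 3 * SECONDS_PER_DAY;   /* three days, no magic number */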

Cheers!
Rich
 

Nobody

There's another interesting graph: X axis is how hard (or expensive)
it is to change a product in the field, and Y axis is bug density.

The curve dives down. Serious hardware and software designs that are
hard to update (like hard ASICs, auto engine control computers, or the
stuff inside flat-screen TVs, or military gear) are debugged pretty
hard before being shipped. Medium things (FPGAs, products that are
easily upgraded with a flash stick) are in-between. Stuff that gets
weekly updates over the web is often horrible.

All of which reinforces the view that the software industry really does
know what it's doing. If reliability actually matters, you get
reliability. If it doesn't, you don't.

The extent to which customers might want reliability only matters insofar
as it affects their purchasing decisions.
Yes, and that makes people care less about bugs. And when a product is
so complex that each bug fix spins off more bugs...

That's mostly just Microsoft. When Linux vendors issue bugfix releases,
they usually are just bugfixes, not version upgrades.

When I maintained Linux servers, I only bothered using a test server for
upgrades. Bug fixes went straight onto the production servers, and I never
encountered a regression as a result. For security fixes, leaving an
unpatched server facing the internet for another hour was a bigger risk
than the patch itself causing problems.
 

Nobody

Sorry late reply, busy here... anyways I am not sure the quest for ever
higher resolution graphics and ever more realistic rendering makes much sense.

Maybe not, but there are a few boundaries where a quantitative improvement
results in a qualitative improvement.

In 3D, it helps if you can manage ~24fps to obtain fluid animation. If
it's too slow, you lose a lot of immersion.

Similarly, there's a world of difference between flat-shaded or
Gouraud-shaded polygons and texture-mapped polygons.

This was one of Doom's main strengths: it was the first texture-mapped 3D
game with fluid animation. Quake aimed to do the same thing for true 3D,
without having to lock the vertical axis.

Everything since then has really just been bigger numbers (mostly
just higher polygon count).

But there's a general principle here which goes beyond Quake. There are
plenty of practical applications where you're dealing with continuous
(analogue) data and you need results which are accurate to a given
tolerance. Being able to use an approximation may result in an order of
magnitude difference to the number of CPU cycles required for the task.
Often, the accuracy/performance curve can have sharp kinks or even steps,
the locations of which are determined by fairly low-level architectural
details.
 

Anssi Saari

C:\files\per\mplayer>mplayer
MPlayer 1.0rc1-3.4.2 (C) 2000-2006 MPlayer Team

Gawd, that old thing might actually predate SSE2 :) Since
unfortunately the mplayer developers can't put together a release,
their current SVN commit is the thing to use. Or get something vaguely
recent, I guess for Windows you want to get SMplayer from
http://smplayer.sourceforge.net/
 

Martin Brown

Nobody said:
Perl is the last language you should be using if you want robustness. It
actively encourages writing code which works 99% of the time; getting that
last 1% requires a tenfold increase in effort.

I could not agree more. But I also wanted to make the point that for a
quick hack it can also be very powerful if you stay within the envelope.

I was intending to contrast the last two with the others which are
robust but I see now that you could read it another way. It is the old
ambiguity problem with natural language descriptions.
Any task which involves reading structured text should begin with a
robust parser which parses everything, not just the outermost layer. The
rest of the program operates on data structured as lists, tuples,
structures, objects, dictionaries, etc.

Manipulating data as structured text using string manipulation functions
is the road to hell. I'm convinced this approach has created more security
flaws than buffer overflows ever have.

I'm not so sure.
Sure, you *can* write correct code in Perl; it just makes doing a
half-arsed job so much easier than doing it right.

I am not so purist as to insist on perfection. A write-once, throw-away
program can be done in whatever language makes the job easiest. The
only tricky thing is making sure that the code really works as intended.

Regards,
Martin Brown
 

Nobody

Although I did once meet a C compiler that pretty much ignored the "const"
part of "const int" and therefore actually allocated memory (in RAM!) and
stored the value, fetching it again every time it was needed.

It's required to store it somewhere so that you can take its address.
"const" variables are still lvalues, not expressions.
 

Nobody

I'm not so sure.

I haven't conducted a rigorous survey, but that's my impression from
being subscribed to BugTraq for the last 13 years.

Injection attacks in (mostly) PHP scripts are the low-hanging fruit of
vulnerability analysis; the phrase "shooting fish in a barrel" comes to
mind. Some days you might see a dozen posts from one wannabe security
researcher who just went looking for the most obvious bugs in PHP scripts,
and found no shortage of them.

Google says:

Results 1 - 10 of about 154,000 for "sql injection attack"

Okay, so by page 10 it's lowered its estimate to 73,400, but it's still
not exactly obscure.
I am not so purist as to insist on perfection. A write-once, throw-away
program can be done in whatever language makes the job easiest. The
only tricky thing is making sure that the code really works as intended.

Security doesn't matter if the program will only be run by its author on
data created by its author; there's no mileage in exploiting your own
account.

Unfortunately, code which reads and/or writes structured text formats is
frequently used in exposed environments, either on web servers or for
processing data obtained straight off the 'net.
 

Martin Brown

Nobody said:
It's required to store it somewhere so that you can take its address.
"const" variables are still lvalues, not expressions.

A lot of the older compilers did that copy to RAM even when they
purported to generate code for embedded systems. The solution was to do
the constant tables in assembler and then export a reference to them.

The physical address of a const in ROM is known at link time. And you
can be sure that no attempt by the CPU to trash a ROM value will ever
succeed. The same cannot be said of a value in RAM if things go haywire.
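
A hedged sketch of the two situations in C (table name and values
invented for illustration):

/* With a toolchain that honours const placement, this stays in ROM: */
static const unsigned short sine_table[] = { 0, 31, 62, 93 };

/* The old workaround: define the table in an assembler module placed
   in a ROM section by the linker, and give C only a declaration, which
   leaves the compiler nothing to copy into RAM: */
extern const unsigned short asm_sine_table[];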

This immutability of ROM did cause amusement in bank switched user
register CPUs like the 9900 and 99k. You could tell it was in big
trouble if the user register bank was in ROM where incrementing the
program counter doesn't work any more.

I never appreciated at the time just how good the 99k was at interrupt
handling until I tried to do the same job on a 68k series.

Regards,
Martin Brown
 

Martin Brown

Nobody said:
I haven't conducted a rigorous survey, but that's my impression from
being subscribed to BugTraq for the last 13 years.

Injection attacks in (mostly) PHP scripts are the low-hanging fruit of
vulnerability analysis; the phrase "shooting fish in a barrel" comes to
mind. Some days you might see a dozen posts from one wannabe security
researcher who just went looking for the most obvious bugs in PHP scripts,
and found no shortage of them.

Google says:

Results 1 - 10 of about 154,000 for "sql injection attack"

Okay, so by page 10 it's lowered its estimate to 73,400, but it's still
not exactly obscure.

I think I will concede the point entirely where PHP Internet exploits
are concerned; that is a lost cause. But I am not sure whether the
problems are the fault of processing structured text by string
manipulation or of inadequate safeguards in the script language.
Security doesn't matter if the program will only be run by its author on
data created by its author; there's no mileage in exploiting your own
account.

Yes. I had in mind jobs which are essentially turning the
rubbish-formatted dump of raw data that some manufacturer outputs into
a format that is useful to their customer. You would not believe the
number of expensive instruments that output measurement data in the
most user-hostile and bulky formats possible.
Unfortunately, code which reads and/or writes structured text formats is
frequently used in exposed environments, either on web servers or for
processing data obtained straight off the 'net.

And that is where it is very prone to abuse. I am still inclined to
think that the problem is more with the current implementations and the
sheer number of bad exploitable scripts around than with the concept
itself. Easy-to-use tools without safety guards are never a good idea.

Regards,
Martin Brown
 

Martin Brown

You are confusing "excellence" with quality here.

If they did not produce a product with *adequate* quality then customers
would not buy it and the company would not make a profit. Microsoft must
be doing something right since the suckers are all buying Office 2007.
There's another interesting graph: X axis is how hard (or expensive)
it is to change a product in the field, and Y axis is bug density.

The curve dives down. Serious hardware and software designs that are
hard to update (like hard ASICs, auto engine control computers, or the
stuff inside flat-screen TVs, or military gear) are debugged pretty
hard before being shipped. Medium things (FPGAs, products that are
easily upgraded with a flash stick) are in-between. Stuff that gets
weekly updates over the web is often horrible.

But that is pure market economics. If it is cheap to fix in the field
then it ships earlier and the company banks the cash. The CEO's duty is
to maximise shareholder value and his own bonus (sometimes mostly the
latter). CPU designs are even more carefully checked and simulated
before they go into production on a fab line. It all depends on the
upfront capital costs to manufacture and the cost of fixing in the
field. You may not like it, but when the in service fix is almost free
to the supplier then they will exploit that to their advantage.

The company management's job is to find the line between a perfect
product delivered too late to make any money at one extreme and a buggy
one that fails to sell because it doesn't work at the other.
I see this in my own work: hardware designs are checked exhaustively,
and we go out for a batch of expensive multilayer boards (we don't
prototype!) and they are usually right. Assembly programming, I'm
pretty careful because the assemble-install-test loop is tedious.
On-screen programming is pretty much type and ignite and see what
happens.

You should probably acquire some better in circuit debugging tools then.
Post mortem and realtime debuggers have advanced a long way. Even the
humble PICs have advanced and cheap debuggers available these days. Bugs
are always cheaper to fix the sooner they are caught so it makes sense
to use the best tools available for the job.

Type and ignite has never been a good method. The coherence length of
code written at a terminal has a bad habit of being equal to the number
of lines that fit on a display page. Everyone does it for tiny throw
away programs but it isn't advisable for anything bigger.
Yes, and that makes people care less about bugs. And when a product is
so complex that each bug fix spins off more bugs...

Some of the complexity is unavoidable in very large systems. Your
experience with one-programmer, few-month projects does not scale well.

A rough guide is that every attempted non-trivial bug fix in a large
project has a 50% risk of causing an unwanted side effect. It means some
are worth leaving in if there is a workaround.

I always recall IBM's FORTRAN G compiler which was so uncertain of its
world view that the result of a successful compilation was:
NO DIAGNOSTICS GENERATED?

Debugging showed that the string length of the message was miscounted and
the trailing NUL was being printed as "?". Rumour had it that their
change procedures made it too onerous to correct this minor cosmetic fault.

Regards,
Martin Brown
 

Martin Brown

John said:
Sigh. I suppose "quality" now means that you can ship it and charge
for it and expect thjat it won't crash too bery often and that your

Yes. Pretty much that is the decision that the bean counters and suits
whose bonuses depend quadratically on shipped product value will take.

I am not defending these practices; I am merely pointing out how it is.
customers will find the remaining hundreds of crashes and security
vulnerabilities by experiment.

There are a couple of reasonable responses to this: use simpler,
preferably free, stuff that does the basic job; and if your PC works
and is finally mostly stable, don't upgrade.

That is fine for shrink wrap consumer software but it doesn't even begin
to address the issues of major commercial projects.
So where are the class-action lawyers when we really need tham?

I hope your defect rate when typing is a lot better when you write programs!
Well, I just think that the tradeoff could be shifted a lot in the
quality direction, without extra expense, and with *sooner*
deliveries, with different methods and culture. I don't think that
security vulns are a marketing tool, especially when everybody now
expects that the next release will be just as bad.

The trouble is that we tend to hope and pray that the next version will
be better. Vista was a dog following after XP which was pretty usable
and is still being installed on new kit by savvy corporates.

The next 'Doze version does look like it might be an improvement; only
time will tell.
The world has suffered mightily from the personality defects of Bill
Gates.

Digital Research were too cocky by half with IBM over MSDOS. They let
Gates get a foothold and win the deal. ISTR IBM licensed MSDOS for $1
per PC through not realising quite how many PCs would get sold.

It isn't just Microsoft. Although they do seem to have a higher defect
rate than is ideal they also have some very talented people like Steve
McConnell who have written very sensible texts on best practice for
defensive programming. Trouble is that it all gets forgotten in the
final rush to finish, test and ship. Sad thing is the humble programmers
typically just get stuffed with free pizzas, Jolt and unpaid overtime.

It is a great shame IBM & MS fell out over OS/2. That platform was very
close to being a robust OS for consumer PCs but Windows was flashier and
the world chose slick Willy's Windows over IBM's staid OS/2. I think
there may still be a few banking systems and air-traffic control systems
running on OS/2.
My embedded stuff works just fine. Things get done quickly and ship
bug-free, because I'm careful.

And the project size is relatively small.
My on-screen stuff is mostly engineering utilities or one-time calcs;
I'd never ship quickie things like this. But if you're arranging a
screen display, the easiest thing to do is run the code, see how it
looks, and tweak for beauty, as opposed to planning every character
location in advance. Small design iterations can take, literally, 10
seconds. Embedded stuff I write, assemble, read, tweak, read, until it
looks perfect, before I ever run it. Reading is a much better way to
debug than testing. Reading is also a lost art in many circles.

I agree. An annotated paper listing or a diagram is hard to beat.
In big systems, roughly half of the bugs are invisible, being module
interactions, so no module author is likely to spot them by reading
his code. So bugs are found by testing, many of that testing done by
users. There must be a better way.

This isn't entirely true. Most of the big project integration faults
stem from ambiguities in the original specifications but they hit home
only when the modules interact. A lot of errors could be found earlier
by the right sort of inspections and walk-throughs.

But you do still have a big problem with system integration of large
projects where N modules have N(N-1) possible pairwise interactions. For
N=1 or 2 this isn't a big deal; for N=100 (already 9,900 interactions)
or N=1000 you have to be exceptionally careful to avoid unintended side
effects.

Regards,
Martin Brown
 

Nobody

I think I will concede the point entirely where PHP Internet exploits
are concerned; that is a lost cause. But I am not sure whether the
problems are the fault of processing structured text by string
manipulation or of inadequate safeguards in the script language.

Injection attacks arise from constructing structured text formats by
directly manipulating the text. The code inserts a string at a point in
the text with the intention that the string will correspond to a node in
the parse tree. But if the string contains characters which are
significant to the parser, it completely changes the overall structure.
E.g. (using C syntax):

sprintf(cmd, "SELECT * FROM mytable WHERE mycolumn = '%s';", value);

This works fine until the attacker does:

/* note the unmatched single quote */
value = "foo';DROP TABLE mytable";

The intention was to "graft" a literal string at a specific point in the
parse tree for the SQL command, but you end up with a completely different
parse tree.

The reason why they're so prevalent in PHP is that the language
is designed primarily for interfacing using textual formats,
particularly HTML and SQL, but the language designers pushed the task of
constructing the data (which is harder than it looks) onto the users.

A saner approach would have been to require SQL, HTML, shell commands, etc
to be constructed through a DOM-style interface, with the language taking
responsibility for generating text with the correct structure. Then the
task would only need to be done once, by (hopefully) experienced
programmers, rather than thousands of times by novices.
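
For contrast, a hedged sketch of the bind-don't-splice approach, using
SQLite's C API purely as a stand-in (the thread doesn't mention SQLite;
it is just a widely known interface with the right shape):

#include <sqlite3.h>

/* The command text is fixed at compile time; user input only ever binds
   to the placeholder, so it cannot alter the parse tree. */
int lookup(sqlite3 *db, const char *value)
{
    sqlite3_stmt *stmt;
    int rc = sqlite3_prepare_v2(db,
        "SELECT * FROM mytable WHERE mycolumn = ?;", -1, &stmt, NULL);
    if (rc != SQLITE_OK)
        return rc;
    sqlite3_bind_text(stmt, 1, value, -1, SQLITE_TRANSIENT);
    while ((rc = sqlite3_step(stmt)) == SQLITE_ROW) {
        /* process one row, e.g. sqlite3_column_text(stmt, 0) */
    }
    sqlite3_finalize(stmt);
    return rc == SQLITE_DONE ? SQLITE_OK : rc;
}

Most database interfaces offer the same shape (placeholders plus bind
calls); the point is that quoting becomes the library's job, done once,
rather than every caller's.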
And that is where it is very prone to abuse. I am still inclined to
think that the problem is more with the current implementations and the
sheer number of bad exploitable scripts around than with the concept
itself. Easy-to-use tools without safety guards are never a good idea.

It's a lack of forethought (and simple laziness) from designers. Need
to provide an interface to SQL databases? Easy:

int execute_sql(const char *cmd);

Need a way to run external programs? Even easier:

int system(const char *cmd);

Need a way to output HTML: document.write().

This approach guarantees that strings obtained from who knows where will
end up being passed verbatim to the functions. Even more so when
the language encourages the use of strings as the primary datatype.
 

krw

Brattain was very nice. I was a high-school student and won a trip to
Bell Labs in a science fair, and I guess one of his functions was to
have lunch with brats like us. I did "meet" Teller at LLNL, as we were
both going through security to get in. He was old and frail and they
treated him like God.

John Bardeen was very nice too. My mother knew him fairly well (he
was a faculty member with my father, though they didn't know each
other well). My mother said that in a crowded room, he would have
been voted the least likely to be a (double) Nobel winner.
I knew one of the inventors of CMOS imager chips. The first ones were
just standard DRAMs with the lids removed, with refresh leakage making
2-level images. Can't remember his name now.

I almost met Lyndon Johnson once. How's that?

My wife's third cousin (or maybe my MIL's, can't remember).
 

krw

We were actually in the Oval Office, waiting for him, when something
happened and he had to split. We did see his fat behind running across
the lawn to get into a helicopter.

Gave us the Viet Nam war, he did. And The Great Society.

The worst part of the Vietnam War, anyway. Yep, my wife whacks me
every time I tell someone she's related to him. ;-)
 