Maker Pro

Skybuck's Universal Code 5 (The Native Version)


Skybuck Flying

RoyalFucks, welcome to my ban list.

Bye,
Skybuck.

Yeahahahaha.
 

JosephKK

Skybuck said:
Nonsense.

Load whatever you need into RAM and proceed from there.

That's what RAM is for.

I have 2 and even 4 GB of RAM.

I have yet to encounter a situation where the harddisk is a real bottleneck.

Bye,
Skybuck.

Stupid ****, I have had to deal with data sets in excess of 200 GB.
While RAM is cheap, I cannot find any mobo that supports that much RAM.
 

JosephKK

Skybuck said:
For a harddrive manufacturer it would make sense to optimize for 0 to N-1,
to keep things simple for programmers.

You agree with that ?

I think they do that.

So file system should do that too.

<context snipped>

If you need it, read previous ones.

Because I don't need it, because what you wrote is BS.

Bye,
Skybuck.

No, everything you write is BS. You appear to be an arrogant, willfully
ignorant, garbage-posting fool. Should you think that you are
otherwise, I suggest that you go down to your friendly local state
college and take several courses. You just might learn something.
 

JosephKK

Skybuck said:
Since Windows is closed source, how are you gonna prove your claims?

Actually the FAT file system is very well documented. The NTFS file
system is moderately well documented. Why are you so ignorant?
One could analyze the binary data but this still requires knowledge of the
file system layout and such.

Also, I wouldn't be too surprised if Windows XP x64 does a little bit of
defragging in the background?

Nope, tough luck, it does not. Neither does 32-bit XP nor any version
of Vista.
Finally, I have a very large harddisk and try to delete as few things as
possible to prevent any fragmentation from occurring.

That does not really help much. M$ OS's have massive amounts of temp
files which cause fragmentation.
 

JosephKK

StickThatInYourPipeAndSmokeIt said:
Horseshit. The only way a file gets fragmented is if it gets edited
since the original write and after other files have been written after
the original write. NO OTHER WAY.

That said, only dopes that want to endlessly thrash their drive
constantly defrag.

I defrag less than once a year. I have several partitions.

System screams right along.

Over-maintenance is idiocy.

FAT and FAT32 FS are accessed the same way they were in the DOS days.
NTFS is a bit more OS managed, but not much.

FAT can frag up. NTFS doesn't frag much at all.

Your knowledge of the subject is fragmented.

Wrong. FAT and NTFS write fragmented files on the very first write.
Too bad you cannot find the truth. Dimbulb / AlwaysWrong
 

JosephKK

StickThatInYourPipeAndSmokeIt said:
On a nearly full, already heavily fragmented drive, yes.

On any drive with huge amounts of space left on them, particularly if
it has never yet been written to, the file write will ALWAYS be
contiguous and monolithic in nature. Also, constantly "defragged" drives
will nearly always have their free space "defragged" as well, and that
will result in several GB of new file writes, of which all will be
contiguous.

Very large files will not obey the rule as much, but a few pieces in a
mostly contiguous huge file will not take much of a hit for it.

A database file is the most likely candidate for fragmentation, as any
change made to the database results in a file fragment which will not be
contiguous with the rest of the file.

Another is if one is DLing two files at once to the same partition.

So, it doesn't happen "all the time". It happens when your volume is
nearly full already, and already has highly fragmented files on it.

If one's drive has never been filled since its original format, a new
file write will always fill a new, contiguous space on the volume.

Horseshit, I have repeatedly seen fragmented files on brand-new installs
of XP and 98. I have been told that there are the same issues with MS
NT, 2000, and Server 2003. Why do you insist on ignorant untruths?
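The "downloading two files at once" scenario from the quoted post can be sketched with a toy first-fit allocator (pure illustration of why interleaved writes fragment both files; this is not how FAT or NTFS actually places clusters):

```python
# Toy model: two files written cluster-by-cluster at the same time
# on an empty first-fit volume end up with alternating clusters,
# so both files are fragmented from their very first write.
disk = []                # sequence of cluster owners
for chunk in range(4):
    disk.append("A")     # file A writes its next cluster
    disk.append("B")     # file B writes its next cluster in parallel

def extents(disk, name):
    """Count contiguous runs of clusters belonging to `name`."""
    runs, prev = 0, None
    for owner in disk:
        if owner == name and prev != name:
            runs += 1
        prev = owner
    return runs

print(extents(disk, "A"), extents(disk, "B"))  # 4 fragments each
```

With sequential writes each file would be one extent; interleaving turns four chunks into four separate fragments per file.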
 

Skybuck Flying

Anyway.

A good idea would be to do both:

1. Check if there is enough space. If so, allocate the complete file.

2. If there is not enough space, ask the user if he wants to continue the
download anyway, and don't allocate the whole file.

3. If the drive runs out of space, warn the user and wait for free space,
then continue.

Nice solution me thinks ;)

Bye,
Skybuck.
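The three steps above can be sketched in a few lines of Python (a minimal sketch; `preallocate_if_possible` is my own illustrative helper, not anyone's actual downloader, and note that `truncate()` may only create a sparse file on some filesystems):

```python
import os
import shutil
import tempfile

def preallocate_if_possible(path, size):
    """Step 1: if the volume has room, reserve `size` bytes up front.
    Returns True when the whole file could be preallocated.
    Caveat: truncate() can create a sparse file that reserves no real
    blocks; on Linux, os.posix_fallocate() allocates actual space."""
    free = shutil.disk_usage(os.path.dirname(path) or ".").free
    if free > size:
        with open(path, "wb") as f:
            f.truncate(size)
        return True
    # Step 2: not enough space -- the caller would ask the user and
    # download without preallocation, handling step 3 (disk full) later.
    return False

# Usage: try to reserve 1 MiB in a fresh temp directory.
tmp = tempfile.mkdtemp()
target = os.path.join(tmp, "download.bin")
ok = preallocate_if_possible(target, 1024 * 1024)
```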
 

Skybuck Flying

The days of the BIOS and complicated HD reading/writing are long gone!

You learned more from me in this thread than you will ever learn in any
college out there! ;) :)

Bye,
Skybuck.
 

Skybuck Flying

Behehehe is that all.

LOL.

Get yourself:

Windows XP X64 Pro Edition.

Use a 64 bit compiler or simply 64 bit integers.

Let the virtual memory do its work = paging.

Bye,
Skybuck.
 

MooseFET

Behehehe is that all.

LOL.

Get yourself:

Windows XP X64 Pro Edition.

Use a 64 bit compiler or simply 64 bit integers.

Let the virtual memory do its work = paging.

The virtual memory system in Windows thrashes when you try to do a
large FFT on 10 byte floats. Recoding the FFT to do the shuffle in
the middle helps.
 

Skybuck Flying

Since when is documentation considered an algorithm ? ;)

Show me the algorithm or show me the code or shut up :)

Bye,
Skybuck
 

StickThatInYourPipeAndSmokeIt

Wrong. FAT and NTFS write fragmented files on the very first write.
Too bad you cannot find the truth. Dimbulb / AlwaysWrong


No, they do not. They CAN do so, but it is NOT the norm.
 

StickThatInYourPipeAndSmokeIt

Horseshit, I have repeatedly seen fragmented files on brand-new installs
of XP and 98. I have been told that there are the same issues with MS
NT, 2000, and Server 2003. Why do you insist on ignorant untruths?


On "brand new installs"? You're an idiot.
 

Archimedes' Lever

The days of the BIOS and complicated HD reading/writing are long gone!

You're an idiot. BIOS level writes are made all the time.
You learned more from me in this thread than you will ever learn in any
college out there! ;) :)

Total bullshit. We learned that you are a stupid ****, and that all you
do is gum up the groups with retarded horseshit.
Bye,
Skybuck.
If only that were really true. Now **** OFF AND DIE!
 

Skybuck Flying

Whatever dipshit.

All I know is I don't have to write complex code.

The bios chip or the harddisk chip or whatever does that :)

I only specify the sector number.

I don't have to specify the head, cylinder or whatever.

Bye,
Skybuck.
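What "I only specify the sector number" refers to is LBA (logical block addressing), where the drive firmware hides the old cylinder/head/sector geometry. The classic translation can be sketched as follows (the geometry constants are illustrative, not taken from any real drive):

```python
# Classic CHS -> LBA mapping that drive firmware performs internally.
# Geometry is illustrative: 16 heads, 63 sectors per track.
HEADS = 16
SECTORS_PER_TRACK = 63

def chs_to_lba(cylinder, head, sector):
    """Sectors are 1-based in CHS addressing; LBA is 0-based."""
    return (cylinder * HEADS + head) * SECTORS_PER_TRACK + (sector - 1)

print(chs_to_lba(0, 0, 1))   # first sector of the disk -> LBA 0
print(chs_to_lba(0, 1, 1))   # first sector of the next head -> LBA 63
```

With LBA, software hands the drive a single linear sector number and the firmware does this bookkeeping, which is why modern code never touches heads or cylinders.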
 

Vladimir Vassilevsky

The virtual memory system in Windows thrashes when you try to do a
large FFT on 10 byte floats. Recoding the FFT to do the shuffle in
the middle helps.

Would you elaborate on that?
I've done trivial FFTs with arrays of up to 64M points (complex
double) without any problems, except it was very slow.


Vladimir Vassilevsky
DSP and Mixed Signal Consultant
www.abvolt.com
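A back-of-envelope check on the numbers in this exchange (my own arithmetic; 16 bytes/point is the complex-double figure from Vladimir's post, and the permutation shown is the standard radix-2 bit-reversal shuffle, not necessarily the exact recoding MooseFET describes):

```python
# 64M complex-double points is only about 1 GiB -- the thrashing comes
# not from raw size but from the access pattern: the bit-reversal
# shuffle touches pages in a scattered order.
points = 64 * 2**20            # 64M points
bytes_per_point = 16           # complex double: two 8-byte floats
total = points * bytes_per_point
print(total // 2**30, "GiB")

def bit_reverse(i, bits):
    """Index permutation used by the radix-2 FFT shuffle."""
    r = 0
    for _ in range(bits):
        r = (r << 1) | (i & 1)
        i >>= 1
    return r

# Neighbouring input indices map to locations half the array apart,
# so with a working set bigger than RAM nearly every access pages:
print([bit_reverse(i, 26) for i in range(4)])
```

Moving the shuffle to the middle of the transform (as MooseFET suggests) keeps each pass working on contiguous runs, which is much kinder to the pager.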
 

SoothSayer

Whatever dipshit.

You're an idiot.
All I know is I don't have to write complex code.

You do not even know how to write simple code, so you certainly have no
capacity for complex code.
The bios chip or the harddisk chip or whatever does that :)


You really are truly in the dark.
I only specify the sector number.
Wrong.

I don't have to specify the head, cylinder or whatever.

You do not get to specify any such things.
 

Michael A. Terrell

StickThatInYourPipeAndSmokeIt said:
On "brand new installs"? You're an idiot.


No he isn't. You're looking in that cracked mirror again, DIMBULB.
It was standard practice: rip open the packaging on a new drive and a
copy of Windows, install it, spend at least a half hour defragging that
new install, then install the drivers and box everything up for the
customer.


--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
 