Skybuck Flying
RoyalFucks, welcome to my ban list.
Bye,
Skybuck.
Yeahahahaha.
Skybuck said: Nonsense.
Load whatever you need into RAM and proceed from there.
That's what RAM is for.
I have 2 and even 4 GB of RAM.
I have yet to encounter a situation where the hard disk is a real bottleneck.
Bye,
Skybuck.
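For concreteness, here is a minimal C sketch of the "load it all into RAM" approach above: the file name is a placeholder, it assumes the file fits in memory and in a long, and everything after the single read works only on the in-memory buffer.

```c
#include <stdio.h>
#include <stdlib.h>

/* Read an entire file into one heap buffer up front, then work on it
   from RAM only.  "data.bin" is a placeholder name for the illustration. */
int main(void)
{
    FILE *f = fopen("data.bin", "rb");
    if (!f) return 1;

    fseek(f, 0, SEEK_END);
    long size = ftell(f);            /* assumes the file size fits in a long */
    fseek(f, 0, SEEK_SET);

    unsigned char *buf = malloc((size_t)size);
    if (!buf || fread(buf, 1, (size_t)size, f) != (size_t)size) {
        fclose(f);
        free(buf);
        return 1;
    }
    fclose(f);

    /* ... all further processing touches only buf[0 .. size-1] in RAM ... */

    free(buf);
    return 0;
}
```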
Skybuck said: For a hard drive manufacturer it would make sense to optimize for 0 to N-1,
to keep things simple for programmers.
You agree with that?
I think they do that.
So the file system should do that too.
<context snipped>
If you need it, read the previous ones.
Because I don't need it, because what you wrote is BS.
Bye,
Skybuck.
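The 0-to-N-1 addressing discussed above is plain logical block addressing: a sector's position on the raw device is just its number times the sector size. A tiny C sketch, where the 512-byte sector size and the drive size are assumptions for illustration:

```c
#include <stdio.h>
#include <stdint.h>

/* Logical block addressing: sectors are numbered 0 .. N-1, and a sector's
   byte offset on the raw device is simply lba * sector_size.
   512 bytes per sector is an assumption; newer drives may use 4096. */
int main(void)
{
    const uint64_t sector_size   = 512;
    const uint64_t total_sectors = 976773168;   /* hypothetical ~500 GB drive */

    uint64_t lba = 123456789;
    if (lba < total_sectors) {
        uint64_t byte_offset = lba * sector_size;
        printf("sector %llu starts at byte offset %llu\n",
               (unsigned long long)lba, (unsigned long long)byte_offset);
    }
    return 0;
}
```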
Skybuck said: Since Windows is closed source, how are you going to prove your claims?
One could analyze the binary data, but this still requires knowledge of the
file system layout and such.
Also, I wouldn't be too surprised if Windows XP x64 does a little bit of
defragging in the background?
Finally, I have a very large hard disk and try to delete as few things as
possible to prevent any fragmentation from occurring.
StickThatInYourPipeAndSmokeIt said: Horseshit. The only way a file gets fragmented is if it gets edited
after the original write, once other files have been written behind it. NO OTHER WAY.
That said, only dopes who want to endlessly thrash their drive
constantly defrag.
I defrag less than once a year. I have several partitions.
System screams right along.
Over-maintenance is idiocy.
FAT and FAT32 FS are accessed the same way they were in the DOS days.
NTFS is a bit more OS managed, but not much.
FAT can frag up. NTFS doesn't frag much at all.
Your knowledge of the subject is fragmented.
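A toy model of the claim above, not any real file system's allocator: clusters are handed out strictly in order, so a file only ends up in two separate runs when it is extended after another file has been written behind it. All names and cluster counts are made up.

```c
#include <stdio.h>

/* Toy FAT-style allocator that always hands out the next free cluster.
   It only illustrates the claim: a file gets fragmented when it is
   extended after other files were written behind it. */
#define MAX_CLUSTERS 32

static int next_free = 0;
static int owner[MAX_CLUSTERS];            /* which file owns each cluster */

static void alloc_clusters(int file_id, int count, const char *name)
{
    printf("%s gets clusters", name);
    for (int i = 0; i < count && next_free < MAX_CLUSTERS; i++, next_free++) {
        owner[next_free] = file_id;
        printf(" %d", next_free);
    }
    printf("\n");
}

int main(void)
{
    alloc_clusters(1, 4, "file A (initial write)");    /* clusters 0..3 */
    alloc_clusters(2, 4, "file B (written later)");    /* clusters 4..7 */
    alloc_clusters(1, 2, "file A (appended/edited)");  /* clusters 8..9 */

    /* file A now occupies 0..3 and 8..9: two non-adjacent runs = fragmented */
    return 0;
}
```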
StickThatInYourPipeAndSmokeIt said: On a nearly full, already heavily fragmented drive, yes.
On any drive with huge amounts of space left on it, particularly if
it has never yet been written to, the file write will ALWAYS be
contiguous and monolithic in nature. Also, constantly "defragged" drives
will nearly always have their free space "defragged" as well, and that
will result in several GB of new file writes, of which all will be
contiguous.
Very large files will not obey the rule as much, but a few pieces of a
mostly contiguous huge file will not take much of a hit for it.
A database file is the most likely candidate for fragmentation, as any
change made to the database results in a file fragment which will not be
contiguous with the rest of the file.
Another case is downloading two files at once to the same partition.
So, it doesn't happen "all the time". It happens when your volume is
nearly full already, and already has a highly fragmented fill on it.
If one's drive has never been filled since its original format, a new
file write will always fill a new, contiguous space on the volume.
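A sketch of the "plenty of free space means a contiguous write" point above, using a toy free-space bitmap with a first-fit search for a contiguous run. The cluster counts are made up and this is only an illustration, not NTFS's or FAT's real allocator.

```c
#include <stdio.h>
#include <stdbool.h>

#define CLUSTERS 64

static bool used[CLUSTERS];

/* Find the first run of `count` free clusters; return its start or -1. */
static int find_contiguous(int count)
{
    int run = 0;
    for (int i = 0; i < CLUSTERS; i++) {
        run = used[i] ? 0 : run + 1;
        if (run == count)
            return i - count + 1;
    }
    return -1;
}

int main(void)
{
    /* Pretend a few early clusters are already taken by old files. */
    for (int i = 0; i < 10; i++) used[i] = true;

    int start = find_contiguous(20);        /* a new 20-cluster file */
    if (start >= 0) {
        for (int i = 0; i < 20; i++) used[start + i] = true;
        printf("new file written contiguously at clusters %d..%d\n",
               start, start + 19);
    } else {
        printf("no contiguous run found, so the write would be split\n");
    }
    return 0;
}
```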
Behehehe, is that all?
LOL.
Get yourself:
Windows XP X64 Pro Edition.
Use a 64-bit compiler or simply 64-bit integers.
Let the virtual memory do its work = paging.
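A sketch of the "let the virtual memory do the paging" suggestion using the Win32 file-mapping API, which is the usual way to do this on Windows XP x64 when compiled as a 64-bit binary. The file name is a placeholder and error handling is minimal.

```c
#include <windows.h>
#include <stdio.h>

/* Map a large file into the (64-bit) address space and let the OS pager
   bring pages in from disk only when they are touched.
   "huge_data.bin" is a placeholder file name. */
int main(void)
{
    HANDLE file = CreateFileA("huge_data.bin", GENERIC_READ, FILE_SHARE_READ,
                              NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (file == INVALID_HANDLE_VALUE) return 1;

    LARGE_INTEGER size;
    if (!GetFileSizeEx(file, &size)) return 1;

    HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READONLY, 0, 0, NULL);
    if (!mapping) return 1;

    /* Map the whole file; no explicit reads, the pager does the I/O. */
    const unsigned char *data =
        (const unsigned char *)MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0);
    if (!data) return 1;

    /* Touch the data: a simple 64-bit checksum over the whole view. */
    unsigned long long sum = 0;
    for (LONGLONG i = 0; i < size.QuadPart; i++)
        sum += data[i];
    printf("checksum = %llu\n", sum);

    UnmapViewOfFile(data);
    CloseHandle(mapping);
    CloseHandle(file);
    return 0;
}
```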
Wrong. FAT and NTFS write fragmented files on the very first write.
Too bad you cannot find the truth. Dimbulb / AlwaysWrong
Horseshit, I have repeatedly seen fragmented files on brand new installs
of XP and 98. I have been told that there are the same issues with
MS NT, 2000, and Server 2003. Why do you insist on ignorant untruths?
The days of the BIOS and complicated HD reading/writing are long gone!
You learned more from me in this thread than you will ever learn in any
college out there!
If only that were really true. Now **** OFF AND DIE!
Bye,
Skybuck.
Since when is documentation considered an algorithm?
Show me the algorithm or show me the code, or shut up.
The virtual memory system in Windows thrashes when you try to do a
large FFT on 10-byte floats. Recoding the FFT to do the shuffle in
the middle helps.
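For reference, here is a plain iterative radix-2 FFT in C (double precision rather than 10-byte extended floats). The bit-reversal "shuffle" at the top is the pass with scattered memory access, and the later butterfly passes pair elements that are up to n/2 apart, which is what makes an array much larger than RAM page heavily. This is only a textbook sketch, not the poster's recoded, shuffle-in-the-middle version.

```c
#include <complex.h>
#include <math.h>
#include <stdio.h>

/* Textbook in-place iterative radix-2 DIT FFT (n must be a power of two). */
static void fft(double complex *a, size_t n)
{
    const double PI = 3.14159265358979323846;

    /* bit-reversal permutation: swaps scattered all over the array */
    for (size_t i = 1, j = 0; i < n; i++) {
        size_t bit = n >> 1;
        for (; j & bit; bit >>= 1)
            j ^= bit;
        j ^= bit;
        if (i < j) {
            double complex t = a[i];
            a[i] = a[j];
            a[j] = t;
        }
    }

    /* butterfly passes: later passes pair elements len/2 apart */
    for (size_t len = 2; len <= n; len <<= 1) {
        double complex wlen = cexp(-2.0 * PI * I / (double)len);
        for (size_t i = 0; i < n; i += len) {
            double complex w = 1.0;
            for (size_t k = 0; k < len / 2; k++) {
                double complex u = a[i + k];
                double complex v = a[i + k + len / 2] * w;
                a[i + k] = u + v;
                a[i + k + len / 2] = u - v;
                w *= wlen;
            }
        }
    }
}

int main(void)
{
    double complex x[8] = { 1, 1, 1, 1, 0, 0, 0, 0 };   /* small test signal */
    fft(x, 8);
    for (int k = 0; k < 8; k++)
        printf("X[%d] = %6.3f %+6.3fi\n", k, creal(x[k]), cimag(x[k]));
    return 0;
}
```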
Whatever, dipshit.
All I know is I don't have to write complex code.
The BIOS chip or the hard disk chip or whatever does that.
I only specify the sector number.
Wrong.
I don't have to specify the head, cylinder or whatever.
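What "I only specify the sector number" hides is the old cylinder/head/sector arithmetic the drive firmware or BIOS now does for you. A small sketch of the classic LBA-to-CHS conversion; the 255-head, 63-sectors-per-track geometry is an assumption, the common translated geometry of that era.

```c
#include <stdio.h>

/* Classic LBA -> CHS conversion, the arithmetic once hidden behind INT 13h. */
struct chs { unsigned cylinder, head, sector; };

static struct chs lba_to_chs(unsigned long long lba,
                             unsigned heads_per_cyl, unsigned sectors_per_track)
{
    struct chs out;
    out.cylinder = (unsigned)(lba / (heads_per_cyl * sectors_per_track));
    out.head     = (unsigned)((lba / sectors_per_track) % heads_per_cyl);
    out.sector   = (unsigned)(lba % sectors_per_track) + 1;  /* sectors are 1-based */
    return out;
}

int main(void)
{
    struct chs c = lba_to_chs(1048576ULL, 255, 63);
    printf("LBA 1048576 -> C=%u H=%u S=%u\n", c.cylinder, c.head, c.sector);
    return 0;
}
```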
StickThatInYourPipeAndSmokeIt said: On "brand new installs"? You're an idiot.