| QUOTE (luther349 @ Apr 30 2004, 10:31 PM) |
the xbox has plenty of memory for a defrag program lol, hell there have been defragmenters since the 386 days lol, the good old DOS command of defrag c: lol.
so xbox memory isn't an issue, but rather the FATX file system takes time to figure out before you can make a defragmenter. |
Your logic is flawed. 386s didn't have 100-200GB hard drives in them. Tens of megabytes take much less memory to map than modern-day hard drives do. The memory limitation on the Xbox is the problem nowadays.
Also, bradfordjohnson, I assume you meant "format" just to be safe. You would want to clean off the Xbox drive totally before reloading all of the data. Transferring it to your computer, defragging it there, then transferring it back to the Xbox would do nothing.
| QUOTE (bobdavis @ May 3 2004, 03:38 PM) |
| Tens of megabytes takes much less memory to map than modern-day hard drives. The memory limitation on the Xbox is the problem nowadays. |
unless I just completely don't understand defragmentation, memory isn't a big limitation. you could defrag it in sections, so it may not guarantee that f:\a.txt is physically next to f:\b.txt, but it would guarantee that those individual files would not be split up.
if you watch an old DOS defragmenter work, it doesn't swap very much to RAM. it works from the start of the disk to the end, and every time it hits a sector (might be wrong terminology, not sure) that's not in the right place, it moves it to a free sector somewhere else on disk, then replaces it with the proper sector. There's some optimization in trying to order the chunks that get moved and such, but THAT is how defragmentation works regardless of memory.
note, I never said it was fast..
:P
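For what it's worth, the start-to-end sweep described above can be modelled in a few lines. This is a toy sketch, not real defragger code: the "disk" is just a Python list of cluster contents, the file names are invented for illustration, and `None` stands for a free cluster.

```python
# Toy model of the start-to-end defrag sweep described above.
# The "disk" is a list of cluster contents; None marks a free cluster.
# desired_layout is what a fully defragmented disk should look like.

def defrag_sweep(disk, desired_layout):
    for i, want in enumerate(desired_layout):
        if disk[i] == want:
            continue  # already in place
        # evict whatever currently occupies slot i into a free cluster
        if disk[i] is not None:
            free = disk.index(None)
            disk[free] = disk[i]
            disk[i] = None
        # pull the wanted cluster into place (search past i only;
        # everything before i is already correct)
        src = disk.index(want, i + 1)
        disk[i] = want
        disk[src] = None
    return disk

# fragmented: file A = A1,A2 and file B = B1,B2, interleaved with gaps
disk = ["A1", "B1", None, "A2", "B2", None]
defrag_sweep(disk, ["A1", "A2", "B1", "B2"])
print(disk)  # -> ['A1', 'A2', 'B1', 'B2', None, None]
```

The point of the model is the memory footprint: at any moment it only holds the index of one cluster being moved, which is why the technique worked on 386-era machines.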
You kids R silly. It depends upon what kind of system we are talking about here. If you are talking about a standard over-the-counter Xbox, then you are talking a 9-gig drive with 32MB of memory. 32MB of memory is plenty of room to defragment a drive. Seeing that you can set up 128k streams, it would not be that big of a deal to defragment the drive. The whole argument that 32MB is not enough memory... BAH... hogwash... the flawed response to the person who said the whole 386 thing was flawed... has a flaw...
:lol:
You see... 386 systems didn't have much memory... as a matter of fact, I remember having a 386 with a 300MB drive. My 386 had a whole whopping 2MB of RAM... and I defrag'd my drive no problem. Now, you also forget extended memory issues in the old days... remember reaching past the 512k (or was it 1MB... been a LONG time ago) limitation? We didn't, and most programs didn't have access to large chunks of memory.
So, you can bet that a 300MB drive was being defragged in less than 512k memory.
Now for the fun part...
<_<
You see... you don't defrag a drive in MEMORY... you defrag a drive in a "virtual partition" you create. That is why most defragger software utilities nowadays require "X" amount of free disk space... they copy data sector by sector to their "virtual" drive... then they clean up that section of the drive... then they copy back those things that will fit on the clean section... and repeat the process.
The only memory being used is to organize sectors... and that can be done in a variable amount depending upon how much time and free memory you have available. If you are using a standard non-dev system with 32MB of memory (minus about 2-3MB for overhead) you will end up with about 28MB available to your program. That is 28MB chunks of data to play around with... out of 9 gig... which is around 322 chunks... of course it isn't really calculated like that... but that's the idea...
You got plenty of memory... because you got the HARD DRIVE... you know, paging to disk?
Good fun....
:jester:
Now, if we are talking development...
There is source code available for ext2fs and Minix defraggers. Might shed some light on the issue.
No you are silly.
| QUOTE |
| If you are talking about a standard over-the-counter xbox then you are talking a 9gig drive with 32mb of memory. |
The standard over the counter xbox has either an 8 or 10 GB HDD and 64 MB of RAM.
Also, I'm not sure exactly how a defragger works, or what kind of memory footprint it leaves, but with the Xbox, there is no demand paging out of the box. Only recently have some highly skilled coders knocked up custom paging libraries. Does FBAXX have these routines? I can't remember, but I know that at least one of the arcade emulators does. If I remember correctly, the guys at Avalaunch were working on this. I can understand why they would take their time, because once you start messing with the file system, things can get really messy really fast. People ask for this tool, but as soon as someone loses all of their data, the person that made the tool would be crucified.
dankydoo
This post has been edited by dankydoo: Jun 10 2004, 04:29 PM
dankydoo, please, don't talk about defrag if you don't know how it works. The truth is that if you can actually use a hard drive in a system, then the system is capable of defragging that hard drive, faster or slower, but capable. I will not enter into details, but this is TRUE. To write a program to do this is another story.
If Team AVA or someone else has not released a defrag utility yet, I suppose it is because it must require knowledge of how the FatX filesystem works or something like that, and this project will not be very trial-and-error friendly.
My vote is for "YES". A program like this would be really welcome and useful for me.
| QUOTE (Sauron-Jin @ Jun 20 2004, 12:31 AM) |
| dankydoo, please, don't talk about defrag if you don't know how it works, the truth is that if you can actually use a harddrive in a system, then the system is capable of defrag that harddrive, faster or slower, but capable. |
He still had a valid point to make, so what's wrong with his post?
And... I really don't get why my earlier post doesn't effectively summarize how a defrag utility works.
| QUOTE |
it works from the start of the disk to the end, and every time it hits a sector (might be wrong terminology, not sure) that's not in the right place, it moves it to a free sector somewhere else on disk, then replaces it with the proper sector. |
Darn, I accidentally clicked null vote, wasn't thinking. My answer is actually YES, so add one more.
My hard drive crashed on me a few days ago. Could this have anything to do with a very cheap white-label drive being too fragmented, and therefore straining under the many seek operations it had to perform over time? I don't know, but I'd certainly like to rule it out just in case... I'd love a defragmenter for that reason.
Same for a chkdsk.
I know this is probably not possible... but could you map the Xbox drives to your computer and try to defragment that way? I am not too savvy with stuff like this, but it seems like it might work.
| QUOTE (Sauron-Jin @ Jun 19 2004, 11:31 PM) |
dankydoo, please, don't talk about defrag if you don't know how it works. The truth is that if you can actually use a hard drive in a system, then the system is capable of defragging that hard drive, faster or slower, but capable. I will not enter into details, but this is TRUE. To write a program to do this is another story.
If Team AVA or someone else has not released a defrag utility yet, I suppose it is because it must require knowledge of how the FatX filesystem works or something like that, and this project will not be very trial-and-error friendly.
My vote is for "YES". A program like this would be really welcome and useful for me. |
I never denied knowing how defrag works. I didn't spell it out because it can be readily found on the internet. I'm only saying that it is very complicated and many complications can arise. Why do you think there isn't already a defragger? Not because it is easy, but because it can ruin the data on a drive. If anyone can do it, I'm sure that team Avalaunch can, and they have stated repeatedly that they are working on it; they are highly skilled and very knowledgeable about many different things. I also was saying that it can't be partially complete or experimental, because if people start losing data, they will complain and accuse the writer of the defragger of not paying attention to this or that. Would you wanna be the person getting flamed for that??? I don't think so.
Dude that's the most fragmented drive I've ever seen. Where is the 19% free space?
Hello,
I'm guessing you guys don't understand how it all works.
First of all, this is how a drive becomes fragmented.
A - File 1 (3 Bytes)
B - File 2 (8 Bytes)
C - File 3 (6 Bytes)
D - File 4 (6 Bytes)
E - File 5 (12 Bytes)
DOTS ... << Free Space
Now let's say we have a brand new hard drive and we wrote the files to it in order, so it looks like this
AAABBBBBBBBCCCCCCDDDDDD..................
Ok, so now you delete File 2, which is letter "B", so now the hard drive looks like this
AAA........CCCCCCDDDDDD..................
Now everything is still good and dandy, but let's say we add File 5 (represented by letter "E", which is 12 bytes).
Ok, so what the Xbox will do is put the first 8 bytes in the open place on the hard drive and the rest at the next available space. So it would look like this
AAAEEEEEEEECCCCCCDDDDDDEEEE..............
Ok, now File 5 ("E") is fragmented into 2 parts. So now the hard drive has to read the first part, then the second part. The more fragmented parts a file has, the longer it takes to read that file.
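The walk-through above can be reproduced with a toy first-fit allocator. This is purely illustrative: one list slot per byte, "." for free space, and none of the real FATX cluster semantics.

```python
# A first-fit allocator over a tiny "disk", reproducing the walk-through
# above (one slot per byte; '.' marks free space).

def write_file(disk, letter, size):
    """Fill free slots from the start of the disk, first-fit, possibly
    splitting the file across several gaps (= fragmentation)."""
    for i, slot in enumerate(disk):
        if size == 0:
            break
        if slot == ".":
            disk[i] = letter
            size -= 1

def delete_file(disk, letter):
    for i, slot in enumerate(disk):
        if slot == letter:
            disk[i] = "."

disk = ["."] * 41
write_file(disk, "A", 3)
write_file(disk, "B", 8)
write_file(disk, "C", 6)
write_file(disk, "D", 6)
delete_file(disk, "B")
write_file(disk, "E", 12)   # 8 bytes fill B's old hole, 4 spill past D
print("".join(disk))
# -> AAAEEEEEEEECCCCCCDDDDDDEEEE..............
```

Running it prints exactly the final diagram: "E" ends up split in two because first-fit reuses B's old hole before touching the free space at the end.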
So now the defragger program goes in and rearranges the fragmented file(s). Ok, now this part I guess seems pretty simple. But here comes the interesting part. Every NTFS-formatted hard drive has something called the Master File Table (MFT).
Now this MFT keeps a record of ALL the files on the hard drive; it also contains file offsets and lengths and read/write stuff. Not sure how the FATX file system works, but as far as NTFS goes, whenever you first format a hard drive the MFT gets reserved about 12.5% of the partition's size (just to give you an idea, 12.5% of a 120GB hard drive would be 15GB). Now if you have more files than that reserved space can hold, the MFT becomes fragmented too. This is where it all gets super tricky, because you have to know where to read/write from, and if you mess up once, then pretty much the whole hard drive may need to be reformatted.
Ok, with all that said, this should give you a pretty good idea of how it all works. Just storing the MFT would be hard, and then managing each separate file... it's just crazy.
If I'm wrong please feel free to correct me!
BlueCELL

I could go into more detail about the clusters and everything, but I don't have the time right now, maybe later... If y'all have any more questions, please feel free to ask!
BlueCELL
If your HDD is really fragmented, do as previously suggested... copy the contents of your drive to another machine, delete them from the Xbox hard drive, then copy the files back to the Xbox hard drive. Hell, that's how we had to (still have to?) do it with a Mac.
Other than that...
Wait for a defragmenter to come out.
As for claims that defragmentation may not speed things up; that may or may not be true. It depends on quite a few things... size and speed of the drive are very important. People have claimed that 7200 RPM HDDs show little if any speed difference based on fragmentation level. I don't exactly buy into that; I still defrag my SCSI drives, and they're much quicker than any IDE HDD out there. Also, whoever made that claim: your drive is NTFS, not FATX. NTFS is quite a bit more complicated, and as such has some ways to manage file loading speed issues and the like.
| QUOTE (Monoxboogie @ Jul 15 2004, 11:12 PM) |
If your HDD is really fragmented, do as previously suggested... copy the contents of your drive to another machine, delete them from the Xbox hard drive, then copy the files back to the Xbox hard drive. Hell, that's how we had to (still have to?) do it with a Mac.
Other than that...
Wait for a defragmenter to come out.
As for claims that defragmentation may not speed things up; that may or may not be true. It depends on quite a few things... size and speed of the drive are very important. People have claimed that 7200 RPM HDDs show little if any speed difference based on fragmentation level. I don't exactly buy into that; I still defrag my SCSI drives, and they're much quicker than any IDE HDD out there. Also, whoever made that claim: your drive is NTFS, not FATX. NTFS is quite a bit more complicated, and as such has some ways to manage file loading speed issues and the like. |
Good post.
I would have expected one to be out already to be used with a Linux distro, as more and more peeps are getting big HDs and filling them up, so they need to be defragmented to run more smoothly.
Antiflag... why doesn't Linux need to be defragged? I can't argue with you because I don't know jack about its FS.
BlueCELL
Couldn't the X, Y and Z drives be used as storage for the data being re-arranged/sorted? Along with the memory of the Xbox, wouldn't that be enough space for a defragger to run?
I'm not assuming that it's that simple, otherwise we'd have a working defragger by now, but couldn't using the temp drives of the Xbox make it a lot easier to do (defrag).
Not exactly sure how big the X/Y/Z drives are, but they're not over 1GB each, and even if they were, it would still be too small. Because think about it: today's huge 120-200+ GB hard drives... 64MB of RAM and a few GB of X/Y/Z.
I mean, yes, it would help, but it's a lot of work.
BlueCELL
| QUOTE (BlueCELL @ Jul 31 2004, 05:16 PM) |
Not exactly sure how big the X/Y/Z drives are, but they're not over 1GB each, and even if they were, it would still be too small. Because think about it: today's huge 120-200+ GB hard drives... 64MB of RAM and a few GB of X/Y/Z.
I mean, yes, it would help, but it's a lot of work.
BlueCELL |
I'm pretty sure the X,Y and Z drives are 768MB each, which would give us over 2GB of "swap file" and 64MB of RAM. How much space is really needed?
I figure the amount of space needed probably isn't a percentage thing, because only so much data can be sorted at once anyway. So even with a 200GB HD, if the Xbox copies files from all over the (fragmented) HD into the X/Y/Z drives in order, then puts them at the beginning of the HD, and so on, would it be feasible?
I guess that's what defragging is anyway, but how long would it take?
Thanks for your input BlueCELL.
BYEEEEE!!
Here is an easy way if you don't find a defragger:
Copy all the files you want to a PC.
Then reformat the partition that needs defragging.
Copy the files from the PC back over to the Xbox.
It may take a lot of time and work, but it will get the job done :-)
No offense dude, but that's blatantly obvious.
Just a note: defragging isn't just for the performance of your drive (loading times etc). It can
a) be helpful in games that stream data from the drive as you play, as if it were streaming from the DVD-ROM; fragmentation there causes stuttering.
b) increase the lifetime of your HD, because the little arms that read the data don't have to jump around the disk surface nearly as much, putting less strain on the drive.
Those are two of the big reasons for defragging. I personally would like to see a defragger, and if I was still in college and had free time I may have just made one myself, but I just don't have time these days.
| QUOTE (dracflamloc @ Aug 22 2004, 04:39 PM) |
Just a note: defragging isn't just for the performance of your drive (loading times etc). It can
a) be helpful in games that stream data from the drive as you play, as if it were streaming from the DVD-ROM; fragmentation there causes stuttering.
b) increase the lifetime of your HD, because the little arms that read the data don't have to jump around the disk surface nearly as much, putting less strain on the drive.
Those are two of the big reasons for defragging. I personally would like to see a defragger, and if I was still in college and had free time I may have just made one myself, but I just don't have time these days. |
Seeing as this thread keeps getting bumped anyways (votes that is), I guess I'll reply to your points.
1) The Xbox DVD drive reads the usable outer edge at a maximum of 6.4MB/s with an average seek time of 130ms, while the HD does a minimum of 12MB/s with a 10ms average seek time. Even a very fragmented hard drive still has a major advantage.
2) Fact of the matter is, games move things to the cache partition they're using if they're going to be constantly accessed, so head movements aren't really a major issue in that sense. The fragmented file is going to get copied to X/Y/Z and won't be fragmented for the majority of the reads.
A more serious issue in terms of the drive's life would be the small enclosed area the drive sits in. The heat is going to have negative effects on the hard drive far before the head has issues reading, I'm sure. Though drives can handle a good amount of heat.
In the end, most people are likely going to see data corruption far before the mechanical parts break down.
It's not a problem of power loss causing data corruption; that could be fixed fairly easily by writing a log file which tells us which sectors were transferred where. The main issue I see would be the amount of time it would take, because I know it would take more than 6+ hrs, and my parents don't want my shit running when I don't use it (power saving).
Essentially, it would be a plus to add an HD defragger to xboxhdm... Obviously, if you really wanted to defrag the drive that way, you'd need to rip the drive out, but...
Seeing as Linux FATX support is great, an XBE based on Linux would work...
To make a virtual memory area for copying files, the cache partitions could be grouped into a virtual hard drive, and a file could be made that spans the three partitions. The defrag program could initialize this file as a temporary storage device.
Technically speaking, I believe a modified version of Linux would be able to accomplish this without the need to rip out the drive and stick it in a PC...
Well, instead of starting out with a full defragger, why don't we try something a little easier with fewer risks? Here's what I'm thinking:
Instead of having it move clusters at a time, have it move entire files. It could move them to the virtual drive that could be made by combining X/Y/Z, and since it would be done this way, each file would move fairly fast.
After moving a certain preset number of files (or a limit could be set on the total size of files moved), the files could be moved back one at a time, ensuring that each file is essentially defragged.
Since each file is moved individually, data corruption could be a bigger problem than with a real defragger. I'm still trying to figure out a way to work around this. I remember somebody mentioned just writing a logfile, but they also mentioned using the X/Y/Z drives as one large drive. Yeah, well, the problem is that as soon as the power flickers or the Xbox shuts off, those drives are going to be split again (as they were only "virtually" together). This means there is also a very good chance the Xbox may decide to format those drives upon reboot. If I remember correctly, it clears them when they reach a certain capacity, and since we'd want to use them to their full potential, this could be a problem when the power flickers and somebody loses 2 gigs of files (which is why I recommend using a file limit, say 10 to 20 files per cycle).
The cons would be a file-size limit of 2 gigs. Since we'd be moving entire files, anything bigger than 2 gigs would be TOO big, and therefore couldn't be moved. But thinking about it, I don't believe anybody should have 2-gig files on their HDs anyway. Movies would be the only exception, but this is why I prefer DivX.
Again, power flickers mean bad mojo, but since we want to use the cache drives, there is no way around this. Not even a logfile could save us then...
Other than that, I think this would be a great start. Post your thoughts and let's see what we can do to make this work. I'll try to answer any questions you have about this too.
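To make the move-whole-files proposal concrete, here is a toy sketch on a cluster map (one slot per cluster, "." free). The "temp copy to X/Y/Z" step is implied: in this model we only need to remember the file's size, free its clusters, and rewrite it into the first gap big enough. Nothing here is a real FATX or cache-partition API; it is all invented for illustration.

```python
# Sketch of the "move whole files out, then write them back" idea above,
# using a toy cluster map: each slot holds a file name or "." for free.

def file_clusters(disk, name):
    return [i for i, owner in enumerate(disk) if owner == name]

def is_fragmented(disk, name):
    c = file_clusters(disk, name)
    return c != list(range(c[0], c[0] + len(c)))

def defrag_file(disk, name):
    """Free the file's old clusters, then rewrite it into the first run
    of free clusters large enough to hold it contiguously."""
    size = len(file_clusters(disk, name))
    for i in file_clusters(disk, name):
        disk[i] = "."          # delete the original copy
    run = 0
    for i, owner in enumerate(disk):
        run = run + 1 if owner == "." else 0
        if run == size:
            for j in range(i - size + 1, i + 1):
                disk[j] = name
            return

disk = list("AAAEEEEEEEECCCCCCDDDDDDEEEE..............")
assert is_fragmented(disk, "E")
defrag_file(disk, "E")
print("".join(disk))
```

Note what it shows: the moved file ends up contiguous, but the hole it left behind stays. Whole-file moves tidy individual files without consolidating free space.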
Moving a few files around from partition A to partition B and back isn't going to help a fragmented drive. For every file, FATX will simply start filling the first unused sector on the disk and move on to the next (most likely resulting in more fragmentation). Defragging must be done on a sector level.
As said, you can move your stuff to a PC and back by FTP; this will result in a 100% unfragmented drive, but it only works if you move _everything_ on a partition.
The other way is with a defrag algorithm, which will require ~20% of the space on a partition to be unused and a vast amount of time. It also requires a skilled programmer with experience in filesystems, who certainly isn't following this thread and would rather spend his/her time writing useful stuff for things that really are needed.
The Xbox filesystem simply isn't meant to contain large amounts of data. People who are concerned with fragmentation should consider keeping their media on a central fileserver with a proper FS and using the Xbox only as a front-end. There are excellent game managers available, and XBMC fully supports Samba.
Now please let this thread rest in peace in the depths of the Xbox-Scene forums.
Well, my 120GB (yeah, I guess low compared to your 400s) never slowed down...
always fast, and I've been deleting, FTPing, erasing, and changing things ALL the time, never a slowdown...
(storage reasons)
Then when I just want to clean up, I export my large files and place the contents of my image file back on the Xbox HDD.
Point is, something like an Xbox Ghost would be cool. Defrags are great, but I like a new pair of shoes.
Ghost one yourself with Slayer's Installer... That's what I did and it works PERFECTLY.
Before I had my account opened, people were asking for it, and yet still nothing.
Until you find a developer who has the skill to do it and feels there is enough reward in it, you're not going to find the tool. Interesting how the xbox-linux people still don't feel it is very critical either...
Well, I am new to the Xbox modding scene, but I am an old timer in regards to computers. Anyway, an Xbox defragmenter will become essential (most people just haven't realized it yet because they don't have large HDDs in their Xbox). When more people start installing large hard drives (160GB+), us current large-drive owners will be saying our I-told-you-so remarks. It is proven that defragmentation is necessary to maintain performance, and anyone who says otherwise in regards to a PC or an Xbox just doesn't know what they are talking about. I just installed a brand new WD 250GB hard drive and I have definitely seen a noticeable slowdown in load times as I make changes (copy new files, delete old ones, move, rename, rearrange). If you ask me, we should have had this from the beginning of the mod scene, right after Team EvolutionX (pretty sure it was them anyway) gave us the ability to upgrade the hard drive. It is a given that we will need it; when the developers decide to get off their ass and begin work on one is another story (I know this is an open-source community and most developers get nothing for their work other than kind recognition, but it has been too long and nothing has been done). If Team Avalaunch is truly working on this type of functionality, then more power to them (I only hope they release it as an app rather than embedded into their dashboard, because I like EvolutionX better), as they have the vision and logic to see that this undertaking is a necessary evil.
Slice
P.S. And to anyone who hasn't figured it out already in regards to hardware limitations: THERE ARE NONE! The Xbox has plenty of resources (processing power, memory, swap-file space, etc.) to be able to defrag a hard drive; it is just a matter of the developers creating one.
Instead of telling others to get off their ass, why don't you start learning the necessary skills to code one up, if it's that important to you? If it takes you a year or 2 to do it, well hey, it's a year or two sooner than if you do nothing. But you're an old timer when it comes to computers, and since the "Xbox is basically a PC", it shouldn't take you a year or 2, should it?
I've been running a 120GB WD HDD in my first modded Xbox for over a year now, it's at about 20 months, and honestly, I can't tell a lick of difference in any loading times from any games. The first game ripped to that HDD, Halo, is still on there, never erased, and it loads the same now as it did 20 months ago when I installed it.
I don't see the need for one myself, and it looks like most of those in the scene feel the same way; if it were that needed/wanted, we'd have it. Look at the XBMC team asking for bounties, and coding beginning on DVD+Menus playback, after sooo many in the community demanded it. Take that as an example: start a fund for a defragger, elsewhere of course, not on Xbox-Scene.com's forums, if you can't take the time to learn the skills to code one up yourself.
| QUOTE (SigTom @ Dec 8 2004, 04:06 AM) |
Instead of telling others to get off their ass, why don't you start learning the necessary skills to code one up, if it's that important to you? If it takes you a year or 2 to do it, well hey, it's a year or two sooner than if you do nothing. But you're an old timer when it comes to computers, and since the "Xbox is basically a PC", it shouldn't take you a year or 2, should it?
|
Note: I'm a hardware junky, NOT a programmer.
| QUOTE (SigTom @ Dec 8 2004, 04:06 AM) |
| I don't see the need for one myself, and it looks like most of those in the scene feel the same way; if it were that needed/wanted, we'd have it. Look at the XBMC team asking for bounties, and coding beginning on DVD+Menus playback, after sooo many in the community demanded it. |
Look at the polls, dude; the majority of participants demand it! Scale those results up and I'd say the majority of the scene wants it.
Slice
It may be wanted, but like I said, I don't think it's needed. No noticeable slowdown loading Halo from the same HDD, over 20 months old. I use the same Xbox to play at least 20+ episodes/week of different TV shows, FTPing them on and off repeatedly for at least the past 10 months, along with my music collection, which isn't the largest, but 200+ songs is still a good amount. If it were such an issue, don't you think this HDD would show some effect of fragmentation? Wouldn't at least one of the numerous games I play show some slowdown? There are only about 10 or so games I play on a monthly basis, and I haven't been able to tell a difference on any of them.
Since you're already halfway there as a hardware junkie, make the leap to software and get coding. Take the same energy you spend on learning and keeping up to date with hardware and apply some of it to software. You could even collaborate with at least one or 2 others in this thread, if not more of the numerous people who have demanded it. There HAS to be enough brain power between enough of you who want it to do it, right?
| QUOTE (charlesnelson9 @ Dec 11 2004, 12:36 AM) |
| I have a 160GB hard drive in my xbox. I'm always reorganizing and moving data to and from the xbox. A few months ago my hard drive started making loud noises while playing, and now it's so bad that the xbox randomly freezes up. I know that it's the hard drive that's making it freeze, because when I put my original hard drive back in to play on Xbox Live, it never freezes. |
If your hard drive is making loud noises, you have far more than just a defragger to worry about.
Besides that, you realize the main type of file that gets fragmented under normal hard drive use, right? Files written concurrently with other files. Good thing for us, we don't really ever have to do that. The main source of fragmentation for us, most of the time, will be a few files filling the size gap left after you delete a game, assuming FATX will use the freshly freed space. Even then, the odds of a single file getting anywhere near as fragmented as it can easily get on a PC are very slim, due to the single-program nature of the Xbox.
In the end, if fragmentation were as large an issue as a very few of you are making it out to be, we'd have a far larger percentage of our userbase complaining about problems a few months down the line; such is not the case.
As it stands, my 250GB drive is having zero issues, and as others have put it and I have found in the past, developers do not see a real reason for it. And the majority do not 'demand' it; they're just saying, sure, they'd most likely use it. Though if there were any risk of data loss with it, I doubt any of you would even think twice about using it. Or after noticing not much of any change in speed afterwards, I doubt they'd use it a second time.
Good luck with it; merely wanting isn't enough to convince people of the merits of doing something. I mean, almost everyone new to the scene wants a PS2 emulator, but again, there is no real purpose to that on the Xbox either.
Well, ok... since I'm not a programmer and do not understand the FATX file system... would it be possible to just transfer my C, E, F and G directories onto my PC, reformat the Xbox hard drive, and then transfer them back over? Or would those files I just transferred to the PC already be fragmented, thus making it a useless idea?
I disagree with those who believe it does not matter (having a defrag tool or not), as the more fragmented a drive gets, the harder your HD has to look for data, which (as far as PCs go) is actually harder on your hard drive and causes more grinding noise. Over time that is bad for your HD and can decrease its longevity.
| QUOTE (ub2slw @ Dec 15 2004, 04:50 AM) |
| Then again, the seek times may be negligible on a 120GB since it is so small. But I'm confident if you had a bench tester you'd see a difference. |
As platter density increases, average seek time doesn't increase. It actually takes less time to seek more data.
You know, the place to start would be a tool to retrieve file information on where the data really is. For instance, say you have BGM.AFS, which you stream music from. This is not something you want horribly fragmented. So it'd be nice to be able to select the file in a program, get details, and have it tell you how many fragments make up the file. At least then you could really see if your important files are fragmented. I for one don't think it'd be that big of an issue except on streaming data.
I don't have an upgraded HD, not that I wouldn't want one. But I have a DVD writer and prefer optical backups over magnetic storage, because HDs are doomed to fail after a good amount of use. Optical discs usually last much longer. Longer load times perhaps, but hey, I'm in no hurry, I can spare a few minutes.
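The fragment-report tool suggested above mostly needs one small piece of logic: given the ordered cluster list for a file (however the FATX cluster chain is actually walked, which is the hard part this sketch assumes away), count the contiguous runs.

```python
# Sketch of the fragment-report idea above: given the ordered list of
# clusters a file occupies, count the contiguous runs it is split into.

def count_fragments(clusters):
    """Number of contiguous runs in an ordered cluster list."""
    if not clusters:
        return 0
    runs = 1
    for prev, cur in zip(clusters, clusters[1:]):
        if cur != prev + 1:
            runs += 1
    return runs

print(count_fragments([7, 8, 9, 10]))      # -> 1 (contiguous)
print(count_fragments([3, 4, 5, 20, 21]))  # -> 2 (split once)
```

A read-only report like this is also far safer to ship than a defragger, since it never writes to the drive.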
| QUOTE (Mage @ Dec 15 2004, 01:27 AM) |
| As platter density increases, average seek time doesn't increase. It actually takes less time to seek more data. |
huh??? Files that are at opposite ends of the platter/drive are going to take longer to find on a 400GB than on a 120GB
| QUOTE (ub2slw @ Dec 17 2004, 08:35 AM) |
| huh??? Files that are at opposite ends of the platter/drive are going to take longer to find on a 400GB than on a 120GB |
Average seek time on a larger drive is lower than on a smaller drive, assuming the same RPM.
For example, with Seagate, from 200GB up the average drops to 8ms instead of 8.5ms.
http://seagate.com/c...dex/1,,,00.html
The platter isn't larger, and seek time is still only dependent on RPM. So seeking from one side to the other will not take longer. However, with the increased density you read a larger chunk of data in the same period of time.

Is it needed? Only if you've painted yourself into a corner. You should be able to back up your apps to your PC and format the xHDD, then move your apps back over. No room to back up your games and movies to your PC? Well, you should have the originals on CD/DVD somewhere. Piracy is bad, right?
But let's say for whatever reason you don't have the original discs and need to back up your games and movies to your PC too... but you have a 300GB xHDD full of games and movies and a 120GB HDD in your PC. That is what I call painting yourself into a corner. Why did you put the bigger HDD in your Xbox to begin with? That is stupid. Why is it stupid? Um, fragmentation and backups... the problems you are facing. Of course, you could go buy another 300GB drive to stick in your PC and use it for backing up your Xbox, but damn, will you buy me one too, Daddy Warbucks?
Every partition on the Xbox (except C) should be treated as a cache drive, not an archive. If you want to archive 100s of GBs of data, you should do it on the PC, where there is better support for such things. Of course, I know my logic on this is circular; as soon as someone develops an Xbox defragger, you will be somewhat justified in this insane practice. I know I've upset a lot of people with the 'stupid' word. I do stupid things too, but I don't expect someone else to save me from the mistakes of my stupidity; I correct the matters myself. Who in their right mind would dig a pond in the desert and then get mad because the people in the nearby town didn't make it rain to fill it up with water? Let the flaming begin!
here's something to waste time pondering. which would take longer:
1. a defragger running on xbox with 300gb hdd 80% full
2. backing up those files to PC, formatting xbox hdd, putting files back
but of course it would be great to have one (as long as it didn't have bugs)
I've seen benchmarks of 3 game installs and game loads showing that a new ATA133 80-conductor cable didn't improve performance in the least.
And I'm pretty sure the Xbox is simply ATA33. I wonder if ATA66 or higher modes could be unlocked with a BIOS, but I don't think so.
I know this is an old thread, but I came across it after suspecting my hard drive was slow, especially during FTP sessions. From what I have read (and please correct me if I am wrong, I could be way off base), the Xbox only writes one thing at a time (versus a PC, where you can have multiple programs all writing to the HDD at the same time), and it should also write files in a logical order, starting at the beginning of the drive and moving forward. Now my question is: if you delete data from the start of your drive, will the next write use that freed space, or the "virgin" free space after the last data written? If it is the latter, you shouldn't need to defrag at all until you have copied enough data to fill the HDD. If it is the former, how smart is the FATX file system: will it try to squeeze a file into a space it doesn't fit in and let it get fragmented, or will it look for free space elsewhere and come back later to fill the gap with a smaller file that fits?
On another note, maybe programmers should consider making a self-defragging filesystem, i.e. when you delete data at the beginning of the drive it shuffles everything down. To use a previous illustration:
AAABBBBCCCCC.........
Delete AAA and write DDDDDD
instead of getting DDDBBBBCCCCCDDD...... you would get BBBBCCCCCDDDDDD
I know that wouldn't work in an environment where you delete and write a lot of small pieces of data frequently; you could end up moving GBs worth of data just to remove a few KB. I think this would have more use in consumer electronics, i.e. Xbox, DVRs, etc. Anyway, I'm sure this idea is flawed or it would have been done by now.
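the shuffle-down idea above is easy to sketch. this is a toy model (a list of one-character cluster labels, '.' for free), not a real filesystem; the function names are made up for illustration:

```python
def compacting_delete(disk, label):
    """Remove every cluster tagged `label`, then slide the survivors
    toward the start so all free space stays in one run at the end."""
    kept = [c for c in disk if c != label and c != '.']
    return kept + ['.'] * (len(disk) - len(kept))

def append_file(disk, label, n):
    """Write n clusters at the start of the free tail."""
    i = disk.index('.')
    disk[i:i + n] = [label] * n
    return disk

disk = list("AAABBBBCCCCC.....")
disk = compacting_delete(disk, 'A')   # slides BBBBCCCCC down
disk = append_file(disk, 'D', 6)
print(''.join(disk))                  # BBBBCCCCCDDDDDD..
```

it also makes the cost objection concrete: deleting the 3 'A' clusters forced 9 other clusters to move, and on a real drive those moves are slow rewrites, not list operations.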
QUOTE(jakejm79 @ Nov 27 2006, 11:47 PM)

If it is the first, how smart is the FATX file system, will it try to squeeze a file in to a space that doesnt fit and have it get fragmented or will it look for free space elsewhere then come back and fill the space with a smaller file that fits.
All FAT implementations in the real world, including the Xbox's FATX, nice though it would be for them to do something more intelligent, basically just pick the earliest available free space on the disk every time they need to allocate a block. That automatically leads to heavy fragmentation on any disk where blocks are allocated and freed frequently. So, yes, it really is that dumb.
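the "earliest free block" policy can be shown in a few lines. again a toy model, not actual FATX code - but it reproduces the fragmented DDDBBBBCCCCCDDD layout from the example above:

```python
def fat_alloc(disk, label, n):
    """First-fit, one block at a time: grab the earliest free clusters,
    splitting the file whenever the first free run is too small."""
    free = [i for i, c in enumerate(disk) if c == '.'][:n]
    for i in free:
        disk[i] = label
    return disk

def delete(disk, label):
    return ['.' if c == label else c for c in disk]

disk = list("AAABBBBCCCCC.....")
disk = delete(disk, 'A')        # frees a 3-cluster hole up front
disk = fat_alloc(disk, 'D', 6)  # D lands in the hole AND at the end
print(''.join(disk))            # DDDBBBBCCCCCDDD..
```

note there's no lookahead at all: the allocator never asks whether the hole is big enough for the whole file, which is why fragmentation is automatic once the disk has holes.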
QUOTE
On another note, maybe programmers should consider making a self-defragging filesystem, i.e. when you delete data at the beginning of the drive it shuffles everything down. To use a previous illustration:
AAABBBBCCCCC.........
Delete AAA and write DDDDDD
instead of getting DDDBBBBCCCCCDDD...... you would get BBBBCCCCCDDDDDD
I know that wouldn't work in an environment where you delete and write a lot of small pieces of data frequently; you could end up moving GBs worth of data just to remove a few KB. I think this would have more use in consumer electronics, i.e. Xbox, DVRs, etc. Anyway, I'm sure this idea is flawed or it would have been done by now.
This would be earth-shatteringly slow and would also pose crash-recovery problems. A retail Xbox doesn't really need any special consideration for fragmentation, since the only data that gets rewritten is TDATA/UDATA, which usually contain small files that are not performance-critical (namely, savegames). DVRs tend to use preallocating filesystems, which, when possible, find a free space large enough to store the entire file they are about to record.
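a sketch of that preallocating strategy, under the same toy disk model as before (list of cluster labels, '.' = free); this is an illustration of the general idea, not any particular DVR's code:

```python
def prealloc(disk, label, n):
    """Place all n clusters in one contiguous free run if possible
    (DVR-style preallocation). Returns the start index, or None if
    no single run is big enough and the caller must fragment or fail."""
    run_start, run_len = None, 0
    for i, c in enumerate(disk):
        if c == '.':
            if run_len == 0:
                run_start = i
            run_len += 1
            if run_len == n:
                disk[run_start:run_start + n] = [label] * n
                return run_start
        else:
            run_len = 0
    return None

disk = list("...BBBBCCCCC.....")
start = prealloc(disk, 'D', 5)  # skips the 3-cluster hole at the front
print(''.join(disk), start)     # ...BBBBCCCCCDDDDD 12
```

the key difference from first-fit is that it deliberately passes over the small hole, trading a little wasted space up front for a guaranteed unfragmented recording.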
Many filesystems used on UNIX systems don't suffer from fragmentation in any meaningful way - ext2/3, ReiserFS, XFS, and JFS are all relatively unaffected. They use various clever techniques to ensure this, which you can read about if you Google; I'm too lazy to explain in any detail.
Disappointingly, NTFS, despite being a modern sensible filesystem from many points of view (arbitrary metadata streams, ACLs, transparent compression/encryption, properly journalled for crash recovery, etc) suffers quite badly from fragmentation as well, as it also tends to make exceptionally stupid decisions about where to allocate new blocks.
QUOTE(hargle @ Nov 30 2006, 03:47 PM)

Is defragging really that much of a deal anymore? I totally understood its use back in the MFM hard drive days when seek times were incredibly slow, but today's uber ninja fast hard drives (not that the xbox really has such a device) have seek times that are incredibly fast.
Sure accesses will be faster, but what's the difference in human time-frame by reducing seek times from 17ms down to 8ms? I think I can wait, but then I'm old and slow myself.

Ah, but modern hard disks tend to have very large caches which prefetch sectors when there's nothing else to do - if you read sector X, the cache is likely to end up holding sectors X+1, X+2, X+3 etc as well. If this all represents contiguous parts of the same file (no fragmentation) then this is likely to be useful, as there's a good chance that the next part of that file will be read next. However, if the disk is heavily fragmented most of this data is useless and will not be used.
Reading data from the cache can be a good four times faster than reading from the disk, even totally discounting seek times and assuming the head is always in the correct position - the cache can effectively push out data at whatever the speed of the interface is.
The same effect repeats itself at a higher level - the OS's buffer cache is also likely to readahead in files which look like they are being accessed linearly, and this is much faster if the data is contiguous.
So, yah. Seek times for reading one sector are not very interesting from a fragmentation POV - it's the overall effect of the system that you have to look at. You rarely access the disk just to read one sector, after all.
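that cache-defeating effect is easy to model. the numbers here (16 blocks, a prefetch window of 4 extra sectors) are arbitrary assumptions chosen just to make the contrast visible, not measurements of any real drive:

```python
def count_misses(layout, prefetch=4):
    """Read a file's blocks in logical order; after each physical read,
    assume the next `prefetch` physical sectors get cached for free.
    Returns how many reads had to go all the way to the platter."""
    cached, misses = set(), 0
    for sector in layout:  # layout[i] = physical sector of logical block i
        if sector not in cached:
            misses += 1
            cached.update(range(sector, sector + prefetch + 1))
    return misses

contiguous = list(range(16))            # file laid out in one run
fragmented = [i * 10 for i in range(16)]  # same blocks scattered about

print(count_misses(contiguous), count_misses(fragmented))  # 4 16
```

same file, same prefetch logic: the contiguous layout satisfies most reads from cache, while the scattered layout wastes every prefetched sector and pays for every single block - which is the readahead argument above in miniature.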