On Thu, Aug 14, 2008 at 11:47 AM, Adam Eberbach aeberbach@mac.com wrote:
I have a 300MB file, a text file which is the output of a long SQL run. I remember opening huge files like this in Visual Studio or SlickEdit years ago without much trouble, so I tried it - no way, it gave up after grabbing about 1.2GB of RAM and exhausting what was free.
OK, silly thing to do anyway - I got what I needed from the terminal using head and tail. But is this expected behavior? Can't it partially load files and load/dump as you scroll? Eating 4x file size in memory and still not having enough seems a bit excessive.
Not complaining, just curious.
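For anyone hitting the same wall, the head/tail approach mentioned above works well from the terminal. This is a minimal sketch; `sample.txt` and the line range are stand-ins for the real 300MB dump:

```shell
# Create a small stand-in file (the real dump was ~300MB).
seq 1 2000 | sed 's/^/line /' > sample.txt

# Print lines 1000-1020 without loading the whole file into memory:
# head stops reading at line 1020, tail keeps the last 21 of those.
head -n 1020 sample.txt | tail -n 21
```

Because head stops reading as soon as it has enough lines, memory use stays constant no matter how large the file is.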
In the meantime I have learned that I cannot open huge files (I do not complain about that - I still have vim), but there was one bigger problem. While editing other files in the same folder I often clicked on such a file purely by accident, and TextMate crashed. Well, maybe it did not really crash (I don't remember exactly), but it definitely stalled, so I had to force-quit it and lost all the work in the other windows. Even when the files themselves were saved, I still lost the set of open windows and had to open and organize everything from scratch again, which was pretty annoying. It would be nice if that could be fixed somehow, but I don't want to complain too much, as other features have priority.
One reason why I love Firefox is that despite its frequent crashes, when I start it again it reopens all the windows that were open before the crash. That would be a nice feature in TextMate. Low priority, but welcome.
Mojca