I have a 300 MB text file, the output of a long SQL run. I remember opening huge files like this in Visual Studio or SlickEdit years ago without much trouble, so I tried it again - no luck: it gave up after grabbing about 1.2 GB of RAM and exhausting what was free.
OK, silly thing to do anyway - I got what I needed from the terminal using head and tail. But is this expected behavior? Can't an editor load the file partially and page chunks in and out as you scroll? Eating 4x the file size in memory and still running out seems a bit excessive.
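
For the curious, this is roughly what I did in the terminal (the filename and line counts here are made up, but you get the idea):

    $ head -n 100 query_output.txt                    # first 100 lines
    $ tail -n 100 query_output.txt                    # last 100 lines
    $ head -n 5000 query_output.txt | tail -n 100     # lines 4901-5000, a peek into the middle

None of these ever holds more than a small window of the file in memory, which is what made the editor behavior surprising to me.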
Not complaining, just curious.
-- Adam