Thanks! Downloaded; I don’t know whether I’ll actually read it (it being apparently over 476,000 words), but it’s great to have.
Did you use the method RicardoFonseca described?
I actually went through every post and manually copied out the relevant part of the HTML code. Then I pasted everything into my text editor (fun fact: vim got quite slow handling the >3 MB HTML file, but emacs handled the task really well) and cleaned it up, replacing all the &nbsp;'s with plain spaces and such. Then I put all the pictures into a folder and changed the references to point to my local files. Then I put it into calibre to create the epub and mobi versions.
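Roughly, the cleanup step could be scripted like this (a minimal Python sketch, not what I actually did; it assumes BeautifulSoup, and the file and folder names are placeholders):

```python
# Rough sketch of the cleanup step, assuming the copied-out HTML is in
# post.html and the images were already saved into a local images/ folder
# (both names are placeholders).
import os
from urllib.parse import urlparse

from bs4 import BeautifulSoup  # pip install beautifulsoup4

with open("post.html", encoding="utf-8") as f:
    # html.parser already turns entities like &nbsp; into plain characters.
    soup = BeautifulSoup(f.read(), "html.parser")

# Point every <img> at the local copy instead of the remote URL.
for img in soup.find_all("img"):
    src = img.get("src")
    if src:
        img["src"] = os.path.join("images", os.path.basename(urlparse(src).path))

with open("post_clean.html", "w", encoding="utf-8") as f:
    f.write(str(soup))
```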
In retrospect, I should have just written a script to do all that, because doing it by hand took way too long. The script would have had to handle the different sites differently (the livejournal stuff in particular is pretty messy), but it would have been so much faster. Like seriously.
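Something along these lines is what I have in mind (a rough Python sketch, not something I actually ran; the URLs and per-site selectors are made up, livejournal would need more special-casing than this, and the ebook-convert call at the end is just calibre's command-line converter):

```python
# Rough sketch of the scraping script: fetch each post, pull out the content
# element, stitch the chapters together, and hand the result to calibre.
# The URL list and the per-site CSS selectors below are placeholders.
import subprocess

import requests  # pip install requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

POSTS = [
    "https://example.livejournal.com/1234.html",    # placeholder URLs
    "https://example.wordpress.com/2012/01/post/",
]

# Crude per-site rule: which element holds the actual post body.
SELECTORS = {
    "livejournal.com": "div.entry-content",
    "wordpress.com": "div.entry-content",
}

def extract(url: str) -> str:
    page = requests.get(url, timeout=30).text
    soup = BeautifulSoup(page, "html.parser")
    for domain, selector in SELECTORS.items():
        if domain in url:
            node = soup.select_one(selector)
            if node:
                return str(node)
    return str(soup.body)  # fallback: keep the whole page body

chapters = [extract(url) for url in POSTS]
with open("story.html", "w", encoding="utf-8") as f:
    f.write("<html><body>" + "\n<hr/>\n".join(chapters) + "</body></html>")

# calibre ships a CLI converter; the mobi version works the same way.
subprocess.run(["ebook-convert", "story.html", "story.epub"], check=True)
```

The image-localizing and entity cleanup from above would slot in between the extract step and the ebook-convert call.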