Hello, /g/, and merry christmas. i have a question for the programmers. i want to create a program (or hire somebody to) that will continuously download the content from two sites and store it. i need them kept updated, since i fear losing these sites during the years i'm going to be away. i leave soon.
so to make a long story short how would i go about this?
Use the general
op here. the general is not being much help. could someone please aid me in this?
>>52061002
rent an amazon EC2 instance, and use it to scrape the sites in question periodically and compress the received files. What on earth could you need to do this for, and why are you going to jail?
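something like this would handle the periodic grab-and-compress part. a rough sketch, not tested, and the URLs are placeholders for whatever the two sites actually are:

# rough sketch: fetch each site once a day and store it gzip-compressed
# the SITES entries are placeholders, swap in the real ones
import gzip
import time
import urllib.request
from datetime import datetime

SITES = ["https://example.com/", "https://example.org/"]

def snapshot(url):
    # download the page and write it out with a timestamp in the name
    data = urllib.request.urlopen(url, timeout=30).read()
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    name = url.split("//")[1].strip("/").replace("/", "_")
    with gzip.open(f"{name}-{stamp}.html.gz", "wb") as f:
        f.write(data)

while True:
    for site in SITES:
        try:
            snapshot(site)
        except Exception as e:
            print(site, "failed:", e)
    time.sleep(24 * 60 * 60)  # once a day

this only grabs the front pages though; for whole sites you want a real crawler or wget, see below.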
>>52061021
not going to jail, but between joining the military and how i intend to live after having done so, i don't expect to have access. being away from things for that long scares me and makes me feel as though certain things could easily be gone, and that is unacceptable.
>>52061021
but alright, scrape the sites and compress the files. how would i do that, and could i have the files saved to a storage device of mine, or would it be more of a store-it-in-the-cloud type thing if i did it this way?
at the very least i would appreciate being pointed in the direction of tutorials or something of the sort.
>>52061196
check out Scrapy if you're into python
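a bare-bones spider would look something like this. the domain is a placeholder, and it just mirrors every page it can reach on the one site:

# minimal Scrapy spider sketch; run with: scrapy runspider spider.py
# example.com is a placeholder for the real site
import scrapy

class ArchiveSpider(scrapy.Spider):
    name = "archive"
    allowed_domains = ["example.com"]
    start_urls = ["https://example.com/"]

    def parse(self, response):
        # save the raw page to disk, named after its URL
        path = response.url.split("//")[1].strip("/").replace("/", "_") or "index"
        with open(path + ".html", "wb") as f:
            f.write(response.body)
        # follow every link on the page so the whole site gets crawled
        for href in response.css("a::attr(href)").getall():
            yield response.follow(href, callback=self.parse)

allowed_domains is there so it doesn't wander off onto every site the pages link to.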
the things i'm trying to gain from this are written works posted on these sites, and i want them in either epub or pdf format. would that be possible with the previously described methods?
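yes. if the works end up saved as html, pandoc can turn them into epub. a sketch, assuming pandoc is installed and using placeholder filenames:

# hypothetical conversion step; pandoc must be installed separately
import subprocess

# "saved_page.html" is whatever file the scraper wrote out
subprocess.run(["pandoc", "saved_page.html", "-o", "story.epub"], check=True)

pdf output works too but needs a latex engine installed; epub is the easier target for web pages.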
>>52061134
rent a server, make a small script to wget the website every x days, then tar it up. Pay some moron to check on the server every y days.
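the wget-then-tar version as a python sketch, if you want it in one script. the URL and directory name are placeholders; stick it in cron to run every x days:

# mirror the site with wget, then tar the copy up with today's date
# URL and MIRROR_DIR are placeholders
import subprocess
import tarfile
from datetime import date

URL = "https://example.com/"
MIRROR_DIR = "mirror"

# -m mirrors recursively, -k rewrites links for offline reading,
# -p also pulls page requisites like images and css
subprocess.run(["wget", "-mkp", "-P", MIRROR_DIR, URL], check=True)

with tarfile.open(f"backup-{date.today()}.tar.gz", "w:gz") as tar:
    tar.add(MIRROR_DIR)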
>>52059782
>>52060366
MOAR!!!