Sunday, February 14, 2010

RapidShare Premium, wget and your download queue

So you just bought a RapidShare Premium account and would like to automate fetching your link list? Here is a very raw solution: cookie-based authentication handled by wget, plus a download queue in the form of a plain text file.

An overview

The idea behind fetching your download queue with wget through your RapidShare Premium account is very simple. Here are the main steps:

  • Enable direct downloads from your RapidShare profile;
  • Make wget save your RapidShare cookie;
  • Make wget use the cookie to fetch a link;
  • Paste all your links in a plain text file;
  • Write a basic bash script that reads the said plain text file – your "download queue".

The step-by-step guide

1. Log into your RapidShare Premium account and go to "Settings". In the "Configuration" section, tick the "Direct downloads" check box and click the "Save" button. With direct downloads enabled, you will no longer be shown the free-vs-premium download page each time you open a RapidShare link; you will get the "save as" dialog instead.

2. Next, make wget store your RapidShare cookie.

wget \
--save-cookies ~/.rs \
--post-data "login=YOUR_RS_USERNAME&password=YOUR_RS_PASSWORD" \
-O - \
https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi

You obviously need to replace YOUR_RS_USERNAME with your RapidShare user name and YOUR_RS_PASSWORD with the password associated with your Premium account. The cookie will be saved in a new file called .rs in your home directory. It is now time to tighten the permissions on that file:

chmod 400 ~/.rs
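If you want to verify the permissions actually took, here is a quick check run on a throwaway file (so the real cookie is not needed); the `-c` flag of GNU coreutils stat is assumed:

```shell
# Create a scratch file, lock it down like the cookie, and inspect it.
f="$(mktemp)"
chmod 400 "$f"
stat -c '%a' "$f"    # prints 400: read-only for the owner, nothing for others
rm -f "$f"
```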

There is one final thing to do here: your plain-text password is now sitting in your command history, where anybody browsing through it could read it. You would not want that. So take a look with

history | tail

You will probably see an output such as:

503 ls
504 wget --save-cookies ~/.rs --post-data "login=YOUR_RS_USERNAME&password=YOUR_RS_PASSWORD" -O - https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi
505 chmod 400 ~/.rs
506 history | tail

Notice the "offset" (the number shown next to the wget command) – in our example, it is 504. Just delete the entry at that offset from your bash history:

history -d 504
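As an alternative, bash can be told up front not to record sensitive commands at all; this is a standard bash history setting, not specific to wget:

```shell
# With this setting in your shell, any command typed with a leading space
# is never written to the history in the first place.
export HISTCONTROL=ignorespace
#  wget --save-cookies ~/.rs ...   <- note the leading space
```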

3. Now that your RapidShare cookie is saved, you can instruct wget to use it. Here is how you can download a single file using your RapidShare Premium account and wget:

wget --load-cookies ~/.rs URL

(You need to replace URL above with an actual valid RapidShare link.)

4. It is time to build your "download queue". Just paste the RapidShare links you want to download in a plain text file, one URL per line. Your file might be called rs_download_queue.txt and it might look a bit like this:

http://rapidshare.com/files/123456781/archive.part1.rar
http://rapidshare.com/files/123456782/archive.part2.rar
http://rapidshare.com/files/123456783/archive.part3.rar
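Before launching anything, it can be worth sanity-checking the queue. This little sketch builds a sample queue file (reusing the illustrative file names above, plus a blank line) and reports any line that is neither blank nor a rapidshare.com link:

```shell
# Build a sample queue file to test against.
cat > /tmp/rs_download_queue.txt <<'EOF'
http://rapidshare.com/files/123456781/archive.part1.rar

http://rapidshare.com/files/123456783/archive.part3.rar
EOF

# Report suspicious lines; say so explicitly when all is well.
grep -vE '^(http://rapidshare\.com/files/|[[:space:]]*$)' /tmp/rs_download_queue.txt \
    || echo "queue looks clean"
```

On the sample above this prints "queue looks clean"; a typo such as a missing "http://" would make the offending line show up instead.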

5. The final step is to wrap up the whole thing and start downloading without further interaction. A very raw solution would be to write a script like this one:

#!/bin/bash

dq="/path/to/rs_download_queue.txt"  # your "download queue"
ck="$HOME/.rs"                       # your RS cookie from step 2

while read -r URL
do
    # trim leading/trailing whitespace and skip empty lines
    URL="$( echo "$URL" | sed -e 's/^[ \t]*//;s/[ \t]*$//' )"
    if [ -n "$URL" ]
    then
        wget --load-cookies "$ck" "$URL"
    fi
done < "$dq"

Save this script as rsdq.sh. Do not forget to make it executable:

chmod 755 rsdq.sh

So each time you need to download something, you just have to edit the rs_download_queue.txt file accordingly and then launch the script rsdq.sh that you just wrote.

Caveats

As stated at the beginning of this post, this is a very unsophisticated solution. It works wonders for me, so I will not bother to improve it, but it is only fair to let you know that:

  • The script does not check for dead links.
  • The script provides no way to resume an interrupted download.
  • The script does not check for the space you have left on the device.
  • The script does not check whether you still have RapidShare traffic left for the current day.
  • The script does not parse the "download queue" file in any way other than skipping empty lines.
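The free-space caveat, at least, is easy to soften. Here is a rough sketch using the POSIX `df -Pk` output; the 1 GiB threshold is an arbitrary choice:

```shell
# Warn before downloading if the current filesystem is low on space.
need_kb=$((1024 * 1024))                         # 1 GiB, in KiB
free_kb="$(df -Pk . | awk 'NR==2 { print $4 }')"
if [ "$free_kb" -lt "$need_kb" ]; then
    echo "warning: only ${free_kb} KiB free on this filesystem" >&2
else
    echo "disk check passed: ${free_kb} KiB free"
fi
```

Dropping this near the top of rsdq.sh (with `exit 1` instead of the warning, if you prefer) would stop the queue before wget starts filling the disk.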

As I said, it works fine for me – as long as I can use it during the RapidShare Happy Hours (when traffic counts as only 10% of its actual size), it's enough.

5 comments:

  1. Thanks. Nice work.

  2. new rapidshare API

    Wget option:

    first "read" cookie

    $ wget -qO- 'https://api.rapidshare.com/cgi-bin/rsapi.cgi?sub=getaccountdetails_v1&withcookie=1&type=prem&login=USERNAME&password=PASSWORD' | grep cookie | cut -d'=' -f2

    and you will get something like this:
    AJSLDKAJSDLAKS10923EKJSDQA09128731KN23JK123097K1JL2H3KL1

    then use:

    $ wget --nocookies --header="Cookie: enc=YOURCOOKIE"

    :)

  3. $ wget --no-cookies --header="Cookie: enc=YOURCOOKIE"

  4. wget --no-cookies --header="Cookie: enc=YOURCOOKIE" URL

  5. Good to know, thank you. I'll keep that in mind... for when I subscribe again to a Rapidshare account. :-)
