
Does anyone know a (free) method for importing large datasets (multiple GB) into Google Colab? GitHub is severely limited, and uploading a folder to Google Drive takes a long time.

  • A better option is to use the wget command. I have discussed various approaches here. – Commented Nov 2, 2020 at 14:33

1 Answer


One option is to download the dataset to your own machine, save it in an easily accessible directory, and then run the following code:

from google.colab import files
data = files.upload()  # opens a file picker; returns {filename: file contents as bytes}

After running the lines above, you will get a Choose File button that lets you browse your system and select your file.

[Screenshot of the Choose File upload widget, added for reference]

    • Unfortunately this method also takes a lot of time to upload. A 25 MB file takes about 10 minutes; forget about uploading GBs. – Commented Dec 12, 2021 at 15:03
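As both comments note, pushing files through the browser is the bottleneck; downloading straight into the Colab VM (with wget, or with plain Python as below) skips the upload entirely. A minimal streamed-download sketch, assuming a hypothetical direct-download URL and filename (replace both with your own):

```python
import shutil
import urllib.request

def download(url: str, dest: str, chunk_bytes: int = 1 << 20) -> None:
    """Stream `url` to `dest` in 1 MiB chunks, so multi-GB files
    never have to fit in memory or pass through the browser."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out, length=chunk_bytes)

# Hypothetical usage inside a Colab cell:
# download("https://example.com/dataset.zip", "dataset.zip")
```

The VM's network link to most hosts is far faster than a home upload, so this usually finishes in minutes even for multi-GB archives; note the Colab VM's disk is ephemeral, so copy results you want to keep to Drive afterwards.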
