I am not able to download data from FTP URLs on the GFDL Data Portal homepage, either by clicking the URL or by using wget. How do I work around this?
FTP is not supported by most browsers at present, and we are also experiencing issues with wget configurations at this time. While we transition to an alternate mechanism for better data downloads, please use the temporary workaround below: download the data with an FTP client of your choice (e.g., lftp).
Example-1
Here is a basic example using ftp.
Question: This FTP URL (just for demonstration) ftp://nomads.gfdl.noaa.gov/users/Ming.Zhao/AM4Documentation/GFDL-AM4.0/inputData does not open for me. How do I work around this?
Step 1:
From your terminal, cd into a directory where you'd like your output to be saved.
Then do the following:
ftp nomads.gfdl.noaa.gov
Step 2: Specify anonymous as both the user name and the password.
Step 3: cd into the data directory (everything after ftp://nomads.gfdl.noaa.gov/) from the FTP URL you were originally trying to use.
cd users/Ming.Zhao/AM4Documentation/GFDL-AM4.0/inputData
Step 4: Use ls and get commands to list and download the necessary files.
E.g., get filename
You'll see the downloaded files in your current working directory.
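For reference, here is what a complete session for this example might look like (the exact prompts vary by ftp client, and README.AM4_run is used only as a sample file name):
ftp nomads.gfdl.noaa.gov
Name: anonymous
Password: anonymous
ftp> cd users/Ming.Zhao/AM4Documentation/GFDL-AM4.0/inputData
ftp> ls
ftp> get README.AM4_run
ftp> bye
If your client supports it, mget with a wildcard (e.g., mget *.nc) can fetch several files in one step.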
Example-2
Here's another example using lftp.
(Please refer to your OS documentation to install lftp. Note that you can use any client of your preference, including GUI-based ones.)
lftp nomads.gfdl.noaa.gov
login anonymous anonymous
cd users/Ming.Zhao/AM4Documentation/GFDL-AM4.0
mirror inputData inputData
In the above example, you log in, change directories, and then recursively copy (mirror) an entire directory to a local directory with the same name as the source.
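If you prefer a non-interactive, one-line form, lftp can also be driven with its -u and -e options. A minimal sketch, assuming the same anonymous login and directory as above:
lftp -u anonymous,anonymous -e "mirror inputData inputData; quit" ftp://nomads.gfdl.noaa.gov/users/Ming.Zhao/AM4Documentation/GFDL-AM4.0
The quit at the end closes the session once the mirror completes.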
Example-3
If you know the direct path to a file, you can also download it with curl -O.
E.g., curl -O ftp://nomads.gfdl.noaa.gov/users/Ming.Zhao/AM4Documentation/GFDL-AM4.0/inputData/README.AM4_run
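curl can also save to a local file name of your choosing with -o, and the anonymous credentials can be passed explicitly with -u if needed. A minimal sketch using the same sample file as above (assumes your curl build includes FTP support):
curl -u anonymous:anonymous -o README.AM4_run ftp://nomads.gfdl.noaa.gov/users/Ming.Zhao/AM4Documentation/GFDL-AM4.0/inputData/README.AM4_run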