
I have seen the answer to Elevation info for multiple lat/long coordinates and I would like to do this with a Python script instead. I cannot use Google's API because I have 8000 points. I have them in a .csv file which I import into pandas as a DataFrame.

something like:

```python
elevation_list = []

def elevation_function(df):
    for lat, lon in zip(df.lat, df.lon):
        elevation_i = some_call_to_file(lat, lon)  # placeholder lookup
        elevation_list.append(elevation_i)
```

My coordinates span the western half of the US, so I'm not sure the function could be so simple. I have downloaded all the SRTM 90m resolution TIFFs from this amazing person

  • gis.stackexchange.com/a/228968/2856 Commented Oct 11, 2019 at 4:53
  • It isn't quite as simple as this question. I have multiple lat/lon pairs that span multiple TIFFs. So first I would need to download all the files. Second, I would need to write a function that parses the metadata and chooses the right file for each coordinate. I am sure someone could make it work, but I was not up to the task. This online method that @Ben Gosak produced is working like a charm. Commented Oct 11, 2019 at 5:09
  • You can create a virtual mosaic with gdalbuildvrt Commented Oct 11, 2019 at 5:11
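The file-selection step described in the comments above could be sketched in Python. The tile-grid arithmetic below assumes the CGIAR SRTM 90m naming convention (`srtm_XX_YY`, 5°×5° tiles, columns counted east from 180°W, rows counted south from 60°N) — verify the convention against your actual downloaded file names before relying on it:

```python
import math

def srtm90_tile_name(lat, lon):
    """Return the name of the CGIAR SRTM 90m tile covering a lat/lon point.

    Assumes the srtm_XX_YY 5-degree grid: column 1 starts at 180W,
    row 1 starts at 60N. Check against your downloaded file names.
    """
    col = int(math.floor((lon + 180) / 5)) + 1
    row = int(math.floor((60 - lat) / 5)) + 1
    return f"srtm_{col:02d}_{row:02d}.tif"

# e.g. a point near Minneapolis (45.19N, 93.33W)
print(srtm90_tile_name(45.19, -93.33))
```

Alternatively, as the comment suggests, `gdalbuildvrt mosaic.vrt *.tif` stitches all tiles into one virtual raster, so a single file handle can be sampled for every point without any per-point file selection.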

1 Answer


Don't know if this is the simplest way, but it saves gathering elevation data yourself. The USGS National Map has a REST service that you can use to query elevation for lat/lon coordinates.

Service URL: https://apps.nationalmap.gov/epqs/ (the old endpoint, https://nationalmap.gov/epqs/, was deprecated as of March 1, 2023)

You can use Python's requests library and format your query string according to the service parameters. You need your input coordinates in NAD83 (lat/lon).

```python
import urllib.parse

import pandas as pd
import requests

# USGS Elevation Point Query Service
# (old endpoint, deprecated 2023: 'https://nationalmap.gov/epqs/pqs.php?')
url = r'https://epqs.nationalmap.gov/v1/json?'

# coordinates with known elevation
lat = [48.633, 48.733, 45.1947, 45.1962]
lon = [-93.9667, -94.6167, -93.3257, -93.2755]

# create data frame
df = pd.DataFrame({'lat': lat, 'lon': lon})

def elevation_function(df, lat_column, lon_column):
    """Query service using lat, lon. Add the elevation values as a new column."""
    elevations = []
    for lat, lon in zip(df[lat_column], df[lon_column]):
        # define REST query params
        params = {
            'output': 'json',
            'x': lon,
            'y': lat,
            'units': 'Meters'
        }
        # format query string and request elevation
        result = requests.get(url + urllib.parse.urlencode(params))
        # old service key path:
        # result.json()['USGS_Elevation_Point_Query_Service']['Elevation_Query']['Elevation']
        elevations.append(result.json()['value'])

    df['elev_meters'] = elevations

elevation_function(df, 'lat', 'lon')
df.head()
```
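For 8000 points, one HTTP request per row is the bottleneck. A small, testable helper that builds the query URL (which could then be fired through a shared `requests.Session` for connection reuse) might look like this — the endpoint and parameter names simply mirror the answer's code, and the `build_epqs_query` helper name is my own:

```python
import urllib.parse

# endpoint from the answer above
EPQS_URL = 'https://epqs.nationalmap.gov/v1/json?'

def build_epqs_query(lat, lon, units='Meters'):
    """Build the EPQS request URL for one NAD83 lat/lon pair."""
    params = {'output': 'json', 'x': lon, 'y': lat, 'units': units}
    return EPQS_URL + urllib.parse.urlencode(params)

print(build_epqs_query(45.1947, -93.3257))
```

Wrapping the actual calls in a single `requests.Session()` reuses the TCP connection across requests, which helps noticeably at this volume.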
    @Bstampe Yes, projecting to web mercator in advance will work. No limit to the number of requests, the process sends a unique request to the service for each coordinate pair. Seemed to be taking about 1-2 seconds per coordinate, so you may want to add a print statement so you know it's running. 5000 refers to spacing between the coordinate values. Commented Oct 10, 2019 at 18:51
  • No worries, I would slice off the first 100 points and test with this process to get a better time estimate. I reran with %timeit and got ~4 sec for 6 records, so you could expect about 1.5 hrs for 8k records. Commented Oct 10, 2019 at 19:05
  • Check out the pyproj library for a python solution. Commented Oct 10, 2019 at 20:24
  • I have converted from lat/lon to UTM. How does your function know which UTM zone you are in? Commented Oct 10, 2019 at 20:29
  • I see I didn't understand the difference between web mercator and utm. This post has a nice function for converting lat lon to web mercator stackoverflow.com/questions/56647700/… Commented Oct 10, 2019 at 20:40
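The lat/lon → Web Mercator (EPSG:3857) conversion mentioned in the comments can be done with pyproj, or by hand with the standard spherical-Mercator forward formulas. This sketch is only for illustration; the function name is my own:

```python
import math

R = 6378137.0  # WGS84 semi-major axis, the sphere radius used by EPSG:3857

def lonlat_to_webmercator(lon, lat):
    """Forward spherical Mercator: degrees -> EPSG:3857 metres."""
    x = math.radians(lon) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R
    return x, y

print(lonlat_to_webmercator(-93.3257, 45.1947))
```

With pyproj the equivalent is `Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True).transform(lon, lat)`, which also handles datum details for you.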
