Small Python script to automatically ingest NICFI Planet basemap data into an Earth Engine asset.
With a Planet API key you can download the NICFI basemap quads and push them to Earth Engine.
Please note that the script copies the data to a Google Cloud Storage bucket before ingesting it into Earth Engine, so you may be charged for storage and transfer.
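Before running the script you need a Planet API key. One way to avoid hardcoding it (a sketch; the environment variable name `PL_API_KEY` is just an illustrative choice, not required by the API) is:

```python
import os

# Read the Planet API key from an environment variable instead of
# hardcoding it in the script. PL_API_KEY is an illustrative name.
API = os.environ.get('PL_API_KEY', '')
url = 'https://api.planet.com/basemaps/v1/mosaics?api_key=' + API
```

If the variable is unset, `API` is empty and the request below will fail with an authentication error rather than a crash.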
import requests
from pprint import pprint
import os
import ee
ee.Initialize()
# insert your Planet API key here
API = ''
url = 'https://api.planet.com/basemaps/v1/mosaics?api_key=' + API
mosaics = requests.get(url).json()['mosaics']
# list the acquisition dates so you can pick the mosaic you want
print([m['first_acquired'] for m in mosaics])
# select the relevant mosaic here (index into the list printed above)
moi = mosaics[3]
print(moi)
# set the area of interest (lon/lat bounding box)
lx = 102.08822490378263  # min longitude
ux = 107.89009812650694  # max longitude
ly = 10.075476746964359  # min latitude
uy = 15.477468765578934  # max latitude
quads = moi['_links']['quads']
#print(quads)
quad_formatted = quads.replace('{lx}',str(lx))\
.replace('{ly}',str(ly))\
.replace('{ux}',str(ux))\
.replace('{uy}',str(uy))
quad_fetch = requests.get(quad_formatted)
quad_fetch = quad_fetch.json()
pprint(quad_fetch['_links'])
pprint(quad_fetch['items'][0])
images = []
nnext = quad_formatted
while nnext:
    next_fetch = requests.get(nnext).json()
    images.extend(next_fetch['items'])
    # follow the pagination link; it is absent on the last page
    nnext = next_fetch['_links'].get('_next')
pprint(images[0])
print(len(images))
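The pagination loop above can be factored into a small reusable helper. This is a sketch under the assumption that every page carries an `items` list and an optional `_links._next` URL, as the Planet basemaps responses do; `fetch_all_items` is a hypothetical name:

```python
def fetch_all_items(first_url, get_json):
    """Collect 'items' from every page, following Planet-style
    '_links._next' pagination links until no next link remains."""
    items = []
    url = first_url
    while url:
        page = get_json(url)
        items.extend(page['items'])
        url = page['_links'].get('_next')
    return items
```

Against the real API you would call it as `fetch_all_items(quad_formatted, lambda u: requests.get(u).json())`.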
downloadurls = sorted([quad['id'], quad['_links']['download']] for quad in images)
for quad_id, url in downloadurls:
    name = quad_id.replace('-', '_')
    # stream the quad straight into the Cloud Storage bucket
    cmd = "wget {0} -O - | gsutil cp - gs://myfolder/../{1}.tif".format(url, name)
    os.system(cmd)
    # ingest the uploaded GeoTIFF as an Earth Engine asset
    cmd = "earthengine upload image --asset_id=/id/{0} gs://myfolder/../{0}.tif".format(name)
    os.system(cmd)
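Shelling out with `os.system` works, but it helps to build the two commands in one place so they can be inspected or logged before running. A minimal sketch (`build_commands` is a hypothetical helper; the bucket path and asset root mirror the placeholders above and are not real locations):

```python
def build_commands(url, quad_id, bucket='gs://myfolder'):
    """Return the (download, ingest) shell commands for one quad.
    The bucket path and the /id/ asset root are placeholders."""
    name = quad_id.replace('-', '_')
    download = "wget {0} -O - | gsutil cp - {1}/{2}.tif".format(url, bucket, name)
    ingest = "earthengine upload image --asset_id=/id/{0} {1}/{0}.tif".format(name, bucket)
    return download, ingest
```

This also makes it easy to dry-run the script by printing the commands instead of executing them.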