Sometimes you just want to gather a bunch of data quickly and efficiently, and Python is a great way to express that. You really only need a couple of libraries (requests, in particular, is wonderfully written — thanks, Kenneth Reitz!). Here’s some code:
import requests
import sqlite3

# First read their documentation - https://www.cryptocompare.com/api/#public-api-invocation -
# so you can understand the querystring and the shape of the response.
r = requests.get('https://www.cryptocompare.com/api/data/socialstats/?id=1182')

# Say you want to look at code repository data. That data is somewhat
# nested, but we can find it eventually. (.json() parses the response
# body into a dict; .text would just give us a string.)
data = r.json()['Data']['CodeRepository']['List']

sql = sqlite3.connect('api_data.db')
cur = sql.cursor()
cur.execute('''CREATE TABLE IF NOT EXISTS api_data
               (url text, last_update integer, open_total_issues integer)''')

# 'List' is a list of dicts, one per repository, so insert every row;
# the named placeholders pull the matching keys out of each dict.
cur.executemany("INSERT INTO api_data VALUES (:url, :last_update, :open_total_issues)", data)

sql.commit()
sql.close()
If you’re not too concerned with rigorous database practices, doing this kind of thing with sqlite3 is fine. You can even create an in-memory database for work in Jupyter notebooks. For applications, I’d recommend setting up PostgreSQL and using SQLAlchemy as the ORM. That way you can add logic to your application without having to hardcode SQL like I did above. Then you can call those scripts or routines with a cron job if you’re running Linux. Another possibility is using apscheduler to automate the calling of functions.
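As a quick sketch of the in-memory option: passing the special path ':memory:' to sqlite3 gives you a throwaway database that lives only as long as the connection, which is handy for notebook experiments. The rows below are hypothetical stand-ins for the API response, just to show the same insert pattern working end to end.

```python
import sqlite3

# ':memory:' creates a temporary database that disappears when the
# connection is closed - nothing is written to disk.
conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('''CREATE TABLE IF NOT EXISTS api_data
               (url text, last_update integer, open_total_issues integer)''')

# Hypothetical rows standing in for the parsed API response.
rows = [
    {'url': 'https://github.com/example/repo-a', 'last_update': 1514764800,
     'open_total_issues': 42},
    {'url': 'https://github.com/example/repo-b', 'last_update': 1514851200,
     'open_total_issues': 7},
]
cur.executemany(
    "INSERT INTO api_data VALUES (:url, :last_update, :open_total_issues)",
    rows,
)
conn.commit()

# Query it back like any other SQLite database.
count = cur.execute("SELECT count(*) FROM api_data").fetchone()[0]
print(count)  # number of rows inserted
conn.close()
```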
But yeah, data from an API into your database in a few lines of Python. Cheers.