The Python of Wall Street, Part 5
Or make APIs great again
Disclaimer:
This project is an eight-part series that I'll use as an opportunity to implement different technologies I want to explore.
You are welcome to re-use any part of this script, but I would not advise using it on the stock market with your money. If you do, I am in no way responsible for whatever may result from it.
Part 1: extracting data and computing trend indicators
Part 2: creating an ETL pipeline for data quality and centralization
Part 3: creating a classification model
Part 4: automatic retraining of the model
Part 5: creating APIs to access the data
Part 6: data visualization
Part 7: creating Docker containers for a microservices architecture
Part 8: process automation
What is an API
API stands for Application Programming Interface. An API is a software intermediary that allows two applications to talk to each other. It is the messenger that delivers your request to the provider that you’re requesting data or a service from and then delivers the response back to you.
It is a channel that enables communication between an application and a database. A REST API sends the data from the database back in JSON format.
An API exposes built-in functionality that is independent of its implementation, providing generic building blocks for the creation of further applications without imposing a specific technology or architecture.
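To make that request/response flow concrete, here is a minimal sketch of what consuming a REST API looks like from Python; the URL is purely illustrative, not a real service.
import requests

# The client sends a request to the provider...
response = requests.get("https://api.example.com/stocks/TSLA")  # illustrative URL

# ...and the API delivers the provider's answer back as JSON the client can use directly
data = response.json()
print(data)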
Why do you need it
It enables developers to rely on an underlying, reusable piece of code for processes that would otherwise be repetitive and highly complex to build from scratch, making application development faster, more resilient and more agile.
Moreover, consider a situation where you want to retrieve some data from a website, say a list of new items. Without an API, you would have to go to the site, dig into its source code and build a web scraper manually. This is very time- and resource-consuming, and doing it instead of concentrating on the actual data you are going to use will slow down the development of the application.
Now suppose there is an API: the developer can retrieve the data easily, or even update it, without downloading the entire dataset to the app's back end.
In a nutshell, APIs make development faster by letting you work more on the front end and the actual application rather than on data pipelines, data acquisition or database updates. Once you have an API, you can easily launch cross-platform apps and scale up.
How to build yours
In a web app, you need to process incoming request data from users. Flask is a Python-based web framework that makes it easy to accept such requests and answer them by querying the database. The output of the query will be returned as JSON.
In this tutorial, we’ll go through how to process incoming data for the most common use cases. The forms of incoming data we’ll cover are: query strings, form data, and JSON objects. To demonstrate these cases, we’ll build a simple example app with three routes that accept either query string data, form data, or JSON data.
In Flask, to get some data, you need to use the request object. This request object holds all incoming data from the query, including the mimetype, referrer, IP address, raw data, HTTP method, and headers.
To gain access to the request object in Flask, you simply import it from the Flask library.
from flask import Flask, request
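One line the snippets below assume but never show explicitly is the creation of the Flask application object itself; something along these lines is needed before any @app.route decorator can be used:
from flask import Flask, request

# Create the application object; the @app.route decorators below attach their URLs to it
app = Flask(__name__)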
We then use the route() decorator to tell Flask what URL should trigger our function. In our case, it will be as follows:
@app.route('/')
@app.route('/query-stock')
This tells the Flask server to handle requests sent to the http://x.x.x.x:port/query-stock URL.
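The decorator only takes effect once it sits on top of a view function; as a minimal placeholder sketch (the real handler built in the rest of this section queries the database instead of returning a fixed string):
@app.route('/')
@app.route('/query-stock')
def query_stock():
    # Flask calls this function whenever / or /query-stock is requested
    return "query-stock endpoint is alive"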
stock_symbol = request.args.get('stock_symbol').upper()
This reads, from the query string, the input containing the symbol of the stock to query for.
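For reference, the form data and JSON cases mentioned earlier are read from the same request object inside a view function; a brief sketch, with illustrative field names:
stock_symbol = request.args.get('stock_symbol')   # query string: /query-stock?stock_symbol=TSLA
form_value = request.form.get('stock_symbol')     # form data posted from an HTML form
json_body = request.get_json()                    # JSON body of a POST request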
stock_symbol = request.args.get('stock_symbol').upper()
SQL = "SELECT * FROM master_record WHERE name = '"
SQL = utils.generate_query(SQL,stock_symbol)+"'"
df = utils.get_data(MYDB, SQL, NAME_DICT)
df = df[df.D_value != -100]
df = df[df.K_value != -100]
df = df[df.RSI != -100]
df = df[df.lower_list != -100]
df = df[df.upper_list != -100]
Then, based on the symbol of the stock, a query is sent to the database, and the rows where some features contain -100 are cleaned out; those values were inserted to compensate for the nulls coming from the moving-average computations.
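Putting the pieces together, the route could look roughly like the sketch below. The helpers utils.generate_query and utils.get_data, as well as MYDB and NAME_DICT, come from the earlier parts of this project and are not defined in this post; the final Response/to_json line is an assumption about how the cleaned DataFrame is sent back, since the return statement is not shown above.
from flask import Flask, request, Response
import utils  # project helper module from the previous parts (not shown here)

app = Flask(__name__)

@app.route('/')
@app.route('/query-stock')
def query_stock():
    # Read the stock symbol from the query string, e.g. ?stock_symbol=TSLA
    stock_symbol = request.args.get('stock_symbol').upper()

    # Build the SQL query for that symbol and fetch the rows as a DataFrame
    SQL = "SELECT * FROM master_record WHERE name = '"
    SQL = utils.generate_query(SQL, stock_symbol) + "'"
    df = utils.get_data(MYDB, SQL, NAME_DICT)

    # Drop the -100 placeholder rows inserted for the moving-average warm-up period
    for col in ['D_value', 'K_value', 'RSI', 'lower_list', 'upper_list']:
        df = df[df[col] != -100]

    # Assumed: send the cleaned rows back to the caller as a JSON array
    return Response(df.to_json(orient='records'), mimetype='application/json')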
if __name__ == '__main__':
    # Start the Flask development server, reachable from any interface on port 5000
    app.debug = True
    app.run(host='0.0.0.0', port=5000)
Finally, this pulls all of those pieces together and exposes them via a web interface that can be reached, for instance (for Tesla), at: http://127.0.0.1:5000/query-stock?stock_symbol=TSLA
To run the script, type python3 master_api.py. The result will look as follows:
If the query you send is successful, in the terminal you'll see:
and in the browser:
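Besides the browser, the endpoint can also be checked from Python; a quick sanity check with the requests library, assuming the server above is running locally on port 5000, might look like:
import requests

# Call the /query-stock route shown above and decode its JSON payload
resp = requests.get("http://127.0.0.1:5000/query-stock",
                    params={"stock_symbol": "TSLA"})
resp.raise_for_status()

records = resp.json()  # list of row dictionaries from the master_record table
print(len(records), "rows received for TSLA")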
Conclusion
Developers are now much more productive than they were before when they had to write a lot of code from scratch. With an API they don’t have to reinvent the wheel every time they write a new program. Instead, they can focus on the unique proposition of their applications while outsourcing all of the commodity functionality to APIs.
Moreover, specifically for this project, once the APIs are ready and I move to microservices, they can be used to make the containers communicate with each other, but also to build other applications or monitoring systems at every step of the way.
It will also make it easier to get data from the database in a cross-environment way to build front-end applications, for instance data visualization in the browser.
Next Steps
Now that the data can be accessed easily in the browser, the next step will be to use the API to create some graphs and display them in the browser. It will be the opportunity to discover D3.js and React.
It will also be a chance to compare their capabilities and performance to other tools like Elasticsearch and Kibana.
References
https://scotch.io/bar-talk/processing-incoming-request-data-in-flask
https://blogs.mulesoft.com/biz/tech-ramblings-biz/what-are-apis-how-do-apis-work/