Case Study



A system that analyzes blockchain data repositories and turns inert data into workable and valuable financial assets

Customer profile: The Maker Foundation, the organization bootstrapping MakerDAO

MakerDAO is famous for DAI — a collateralized, decentralized digital currency pegged to the USD. Currently, MakerDAO has $5 billion locked as collateral and $2 billion DAI in circulation.

DAI is particularly useful in unbanked communities with currency depreciation and increasing inflation, where people lack access to conventional financial services.

Challenge: An inefficient process for collecting and processing financial data hampers the crypto provider’s decision-making

The client required numerous reports and analytics to process market data — cryptocurrency exchange rates and interest rates — and track their fluctuations over time. Armed with this analytical data, the client would get actionable insights enabling more rational decision-making.

Data sourced from the blockchain is highly technical and detailed, which makes gathering it manually a daunting task that is prone to data loss. The Maker Foundation needed to set up an ETL (Extract, Transform, Load) pipeline to gather data and maintain records of the sourced data, but since they didn’t have enough data science expertise in-house, the client turned to Unicsoft.



PostgreSQL, MySQL, Google BigQuery, Python (Flask, APScheduler, pandas, SQLAlchemy)



Team augmentation



DeFi, Fintech



Data Science

We synchronized the Flask server and APScheduler jobs to avoid data duplication, improved the client’s existing scripts, and wrote a new method for loading data into the PostgreSQL database. With the scripts sorted out, analysis became straightforward, and the whole process became fully automated.
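The duplication-safe loading step can be illustrated with a minimal sketch. Everything below is an assumption for illustration only — the table name, columns, and the use of SQLite in place of the client’s PostgreSQL database are ours, not the project’s actual schema; in production this kind of upsert would run through SQLAlchemy against Postgres:

```python
import sqlite3

# Hypothetical schema: one row per (asset, observed_at) market-rate sample.
DDL = """
CREATE TABLE IF NOT EXISTS market_rates (
    asset       TEXT NOT NULL,
    observed_at TEXT NOT NULL,
    rate_usd    REAL NOT NULL,
    PRIMARY KEY (asset, observed_at)
)
"""

def load_rates(conn, rows):
    """Idempotent load: re-running the same batch updates rows instead of
    duplicating them, so overlapping scheduler runs stay safe.

    `rows` is an iterable of (asset, observed_at, rate_usd) tuples.
    """
    conn.executemany(
        """
        INSERT INTO market_rates (asset, observed_at, rate_usd)
        VALUES (?, ?, ?)
        ON CONFLICT (asset, observed_at) DO UPDATE SET
            rate_usd = excluded.rate_usd
        """,
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(DDL)
batch = [("DAI", "2021-06-01T00:00:00Z", 1.001),
         ("ETH", "2021-06-01T00:00:00Z", 2706.5)]
load_rates(conn, batch)
load_rates(conn, batch)  # second run upserts; no duplicate rows appear
count = conn.execute("SELECT COUNT(*) FROM market_rates").fetchone()[0]
print(count)  # 2 rows, not 4
```

The key design choice is making the load idempotent: if two scheduled jobs overlap or a batch is retried, the primary-key upsert guarantees each observation is stored exactly once.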

ETL Pipeline for crypto analytics that drive data-based decisions

Project approach: Smoothly augmenting the client’s team with an experienced data scientist

As soon as the client reached out to us looking for candidates, we arranged a swift recruitment process according to their requirements and specifications.

While all the candidates we presented were highly skilled and experienced, the client chose the one they considered the best fit. After our data scientist had finished setting up the ETL pipeline and gathering analytics, he shared his knowledge of the project and supported its smooth handover to an in-house data scientist on the client’s team.

We succeeded in delivering this project thanks to a number of factors: the skills of our engineer, a network of professionals we could choose from, a fast hiring process, and in-depth engagement with the client’s business.

Result: Complete and relevant DeFi analytics

We helped our client develop an automated process for gathering data from the blockchain through an ETL pipeline — a solution that extracts data from different sources, transforms it into the required format, and loads it into the client’s database. Today, the entire process is automated and requires minimal support from the client’s side. They can now get complete and relevant DeFi analytics faster than ever before.
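Conceptually, the three stages described above can be sketched as follows. This is a simplified illustration under our own assumptions — the sample records, field names, and in-memory store are invented, and a real deployment would pull from blockchain APIs on an APScheduler schedule rather than call the pipeline by hand:

```python
import sqlite3

def extract():
    """Extract: stand-in for pulling raw records from blockchain data
    sources (hard-coded sample rows here, for illustration)."""
    return [
        {"pair": "DAI/USD", "price": "1.0009", "ts": "2021-06-01"},
        {"pair": "ETH/USD", "price": "2706.50", "ts": "2021-06-01"},
    ]

def transform(raw):
    """Transform: normalize raw string fields into typed tuples in the
    format the analytics database expects."""
    return [(r["pair"], r["ts"], float(r["price"])) for r in raw]

def load(conn, rows):
    """Load: write the transformed rows into the target database."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS rates (pair TEXT, ts TEXT, price REAL)"
    )
    conn.executemany("INSERT INTO rates VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    """One ETL run: extract -> transform -> load."""
    load(conn, transform(extract()))

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
n = conn.execute("SELECT COUNT(*) FROM rates").fetchone()[0]
print(n)  # 2
```

Splitting the pipeline into three small, independently testable functions is what makes such a process easy to automate and hand over to an in-house team.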