# LMS-DB-ETL
An Extract, Transform, Load (ETL) app to gather book information from public APIs for
a Proof of Concept Library Management System project.

(Past Git history can be found at: https://github.com/Kalarsoft/LMS-DB-ETL and
https://gitea.com/NickKalar/LMS-DB-ETL)

## Problem
Currently, I am working on building a Library Management System (LMS) to help
develop and showcase my software engineering skills. In order to fully test
and run the LMS, I need a database that is populated with a variety of
different media. As I am one person, and have only about 300 books to my name,
this problem needed a better solution than manually adding those books.

## Solution
This project seeks to seed a database with book details, mostly pulled from
public APIs. The current version uses the Google Books API and Open Library
API. After pulling data from these APIs for several books, the data is merged
and transformed to be loaded into a PostgreSQL database for consumption by the
RESTful APIs associated with the LMS project.
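
A minimal sketch of that flow, assuming `requests` for the HTTP calls and
`psycopg2` for the PostgreSQL load, is shown below. The `books` table layout,
the `DATABASE_URL` variable, and the example ISBNs are illustrative assumptions,
not a description of this repository's actual code.

```python
# Illustrative sketch of the extract -> merge/transform -> load flow.
# Assumptions (not taken from this repo): the `requests` and `psycopg2`
# libraries, the `books` table layout, and the DATABASE_URL connection string.
import os

import requests
import psycopg2


def extract(isbn: str) -> tuple[dict, dict]:
    """Pull raw records for one ISBN from both public APIs."""
    google = requests.get(
        "https://www.googleapis.com/books/v1/volumes",
        params={"q": f"isbn:{isbn}", "key": os.environ["GOOGLE_API_KEY"]},
        timeout=10,
    ).json()
    open_library = requests.get(
        f"https://openlibrary.org/isbn/{isbn}.json", timeout=10
    ).json()
    return google, open_library


def transform(isbn: str, google: dict, open_library: dict) -> dict:
    """Merge the two records, preferring Google Books and falling back to Open Library."""
    info = (google.get("items") or [{}])[0].get("volumeInfo", {})
    return {
        "isbn": isbn,
        "title": info.get("title") or open_library.get("title"),
        "page_count": info.get("pageCount") or open_library.get("number_of_pages"),
        "published": info.get("publishedDate") or open_library.get("publish_date"),
    }


def load(rows: list[dict]) -> None:
    """Insert merged records into a hypothetical `books` table."""
    with psycopg2.connect(os.environ["DATABASE_URL"]) as conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO books (isbn, title, page_count, published) "
            "VALUES (%(isbn)s, %(title)s, %(page_count)s, %(published)s) "
            "ON CONFLICT (isbn) DO NOTHING",
            rows,
        )


if __name__ == "__main__":
    # Arbitrary example ISBNs for demonstration purposes only.
    isbns = ["9780140328721", "9780441172719"]
    load([transform(isbn, *extract(isbn)) for isbn in isbns])
```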

This is a rudimentary ETL pipeline: it relies on no external tools and uses only
two Python libraries for making the API calls and connecting to the database.
However, it does showcase my understanding of Data Engineering and the ETL
cycle.

## Setup

Environment Variables:
`GOOGLE_API_KEY` - API key required for using the Google Books API.
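
For illustration only, a script might read the key like this at startup; the
error handling and the example request URL below are assumptions, not the
project's actual code:

```python
import os

# Hypothetical startup check, assuming the GOOGLE_API_KEY variable named above.
google_api_key = os.environ.get("GOOGLE_API_KEY")
if not google_api_key:
    raise RuntimeError(
        "GOOGLE_API_KEY is not set; create a key in the Google Cloud Console "
        "and export it before running the ETL."
    )

# Google Books requests typically pass the key as the `key` query parameter, e.g.:
# https://www.googleapis.com/books/v1/volumes?q=isbn:9780140328721&key=<GOOGLE_API_KEY>
```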