
Led a team of 3 developers on a nightly job that migrates school district data, removing the burden of manual data entry.

Problem

Our company had built a robust learning management system (LMS), but we didn’t have a way to sync with a school district’s existing system; their schools had to do it manually through Excel uploads or by filling out forms on the front end, which meant maintaining data in two places. For small schools this process was time-consuming and error-prone, and for school districts it was a deal-breaker.

Our CEO asked me to build a system that would allow our platform to sync with any external LMS system. A discovery phase revealed that most school districts model their data after IMS standards and make their data accessible through CSV files or a REST API.
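The IMS OneRoster standard, for example, distributes roster data as CSVs such as users.csv. A minimal sketch of reading such a file (the column names follow the OneRoster CSV binding; the sample rows are invented for illustration):

```python
import csv
import io

# Invented sample of a OneRoster-style users.csv export.
SAMPLE_USERS_CSV = """sourcedId,status,username,givenName,familyName,role
u-001,active,jdoe,Jane,Doe,student
u-002,active,bsmith,Bob,Smith,teacher
"""

def parse_users(csv_text):
    """Return a list of user dicts keyed by the CSV's column names."""
    return list(csv.DictReader(io.StringIO(csv_text)))

users = parse_users(SAMPLE_USERS_CSV)
print(users[0]["givenName"])  # Jane
```

In practice each nightly export would be one such file per entity (users, classes, enrollments), so a dict-per-row representation like this keeps the downstream transform steps simple.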

Impact

Schools were unable to use the LMS for up to 3 days or longer, depending on school size. The school day would be interrupted by manual entry errors (e.g. a student registered in the wrong class), and resolving each error would eat into the day of the student, teacher, counselor, and IT staff.

Solution

An ETL (Extract, Transform, Load) application that would SFTP into a school district’s server, monitor changes in CSVs, update our LMS nightly, and lay the foundation for an eventual REST integration.

After researching SaaS products that could handle this process for us, I realized none of them offered an end-to-end solution that could handle our complicated data massaging, so I needed a custom ETL tool.

I came up with an architecture that included a cron job to check for new CSV records nightly, migrate them to S3 for archival purposes, and trigger a Lambda to handle our complex data massaging and update the necessary databases across our APIs. I also designed the application to let IT users enter their SFTP and REST credentials and instantly sync their data with no manual involvement on our part.
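The core of the nightly job is detecting which records are new or changed between two CSV snapshots. A minimal sketch of that comparison step (the function name and sample data are hypothetical; in the real pipeline the snapshots arrive over SFTP and are archived to S3 before a Lambda runs this kind of diff):

```python
import csv
import io

def diff_records(previous_csv, current_csv, key="sourcedId"):
    """Return (new, changed) rows between two nightly CSV snapshots,
    matching rows by a stable identifier column."""
    prev = {r[key]: r for r in csv.DictReader(io.StringIO(previous_csv))}
    curr = {r[key]: r for r in csv.DictReader(io.StringIO(current_csv))}
    new = [r for k, r in curr.items() if k not in prev]
    changed = [r for k, r in curr.items() if k in prev and r != prev[k]]
    return new, changed

last_night = "sourcedId,name\n1,Jane\n2,Bob\n"
tonight = "sourcedId,name\n1,Jane\n2,Robert\n3,Ana\n"
new, changed = diff_records(last_night, tonight)
print(len(new), len(changed))  # 1 1
```

Diffing against the archived previous snapshot is what keeps the nightly update small (only the delta touches the LMS databases) even when the district's full export is large.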

After completing a prototype, I assigned the remaining tasks to my development team for completion.

Sprinkle of Sales

I made sure to build transaction monitoring into the scope because previous ETL jobs had taught me the importance of knowing the source of every data movement. This feature could later grow into a full-blown dashboard for school districts to monitor their data pipelines. The Sales team loved the idea because we could potentially charge more for it.
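The monitoring idea boils down to recording the source, destination, and size of every data movement so a dashboard can replay the pipeline's history. A hypothetical sketch (class and field names are invented for illustration):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TransferLog:
    """Audit trail for the ETL pipeline: one entry per data movement,
    so districts could later inspect where every record came from."""
    entries: list = field(default_factory=list)

    def record(self, source, destination, record_count):
        self.entries.append({
            "source": source,
            "destination": destination,
            "records": record_count,
            "at": datetime.now(timezone.utc).isoformat(),
        })

log = TransferLog()
log.record("sftp://district/exports/users.csv", "s3://archive/users.csv", 103)
log.record("s3://archive/users.csv", "lms.students", 103)
print(len(log.entries))  # 2
```

Persisting entries like these per nightly run is what would make a district-facing pipeline dashboard feasible later without re-architecting the job.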

Architectural Diagram

Results


Video Overview

Led a team of 3 developers on a nightly job to migrate school district data through CSVs instead of burdening school administrators with manual data entry.
Client Journey:

Initial Client Meetings

  • Came up with an architecture to pull CSVs from their servers nightly
  • Advised their IT teams on how to structure their exported data

Consulted on these items:

  • Parent usernames should be their email addresses, so they’re easier to remember
  • Don’t spend development time handling > 300K records, since nightly updates touch only ~100 records.

Project Completion

School principals thanked us for never having to fill out a student form again.



Video Overview:

Architected a microservice offering document storage to multiple apps, where the user decides how their data can be used.

Problem

Our company was building a document sharing application for healthcare and education providers to gather research from marginalized communities. I repeatedly heard from stakeholders that users in the community were apprehensive about sharing data: they didn’t want it shared with a third party without their consent or, worse still, with government authorities, given poor past experiences with law enforcement officials.

Impact

The providers were not getting buy-in from their users, so application use was minimal, or features that required users to share personal information were simply ignored.

Solution

I suggested we build an application that went beyond document sharing by making data transparency the core of the data model, so the user knows exactly how their data is being used. It could offer further features to empower the user, such as setting expiration dates and deciding permissions on the documents they share.

I made sure to emphasize that the data model and overall application architecture must honor the permission levels set by the user, rather than letting the front end decide what can be shown.

Application Flow
  1. Admin user (e.g. HR department) requests to see an employee’s document through the web app.
  2. Admin inputs what they plan to do with the data.
  3. End user is notified on their mobile app about the request.
  4. Document is automatically added to the admin user’s desired folder (saving time on manual drag-and-drop).
  5. End user is notified via push notification of any document activity they want to monitor (e.g. sharing, opening).
  6. Background activities commence based on end-user preferences (e.g. document deletion).
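The flow above hinges on the server-side model, not the front end, deciding whether access is allowed. A hypothetical sketch of that enforcement (class and method names are invented; the real service would persist these checks and fan out notifications):

```python
from datetime import date

class SharedDocument:
    """User-controlled sharing: the owner sets allowed actions and an
    optional expiration, and every access attempt is logged so the
    owner can be notified of activity they chose to monitor."""

    def __init__(self, owner, allowed_actions, expires_on=None):
        self.owner = owner
        self.allowed_actions = set(allowed_actions)  # e.g. {"view", "share"}
        self.expires_on = expires_on                 # optional expiration date
        self.activity = []                           # feeds owner notifications

    def authorize(self, requester, action, today=None):
        today = today or date.today()
        if self.expires_on and today > self.expires_on:
            return False  # access lapsed; background cleanup may delete
        allowed = action in self.allowed_actions
        self.activity.append((requester, action, allowed))
        return allowed

doc = SharedDocument("employee-42", {"view"}, expires_on=date(2030, 1, 1))
print(doc.authorize("hr-admin", "view", today=date(2025, 6, 1)))   # True
print(doc.authorize("hr-admin", "share", today=date(2025, 6, 1)))  # False
```

Because the permission set and expiration live on the document record itself, any client (web app, mobile app, background job) gets the same answer, which is the point of keeping transparency in the data model.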

Results


Video Overview

Served as Tech Lead to create a single-page application (SPA) that allows city residents to pitch ideas for technological solutions that could benefit the city.

Features:

  • Architected front-end application using React/Redux
  • Leveraged Redux containers to minimize code and simplify operations for future developers
  • Implemented the Saga pattern to handle async operations
  • Supervised development of back-end using Python and Django framework
  • Implemented Django ODBC driver to MS SQL Server for data layer
  • Adhered to TDD design patterns for code maintainability and scalability

 


Joys
  • Learning the exciting new framework everyone was talking about 😉
  • Transitioning from the question “asker” to question “answerer” as my skills with React increased
Lessons Learned
  • Wrapping my head around the component architecture approach of React versus the more MVC approach of Angular
  • Configuring the MS SQL ODBC driver with Django despite limited documentation, since Django expects a connection to a data source like Postgres.
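For reference, a settings fragment for that kind of setup might look like the following. This is a sketch assuming the mssql-django backend and Microsoft's ODBC driver; the database name, host, and credentials are placeholders, and the driver string must match what is actually installed:

```python
# Hypothetical Django settings.py fragment for SQL Server over ODBC,
# assuming the mssql-django backend package is installed.
DATABASES = {
    "default": {
        "ENGINE": "mssql",
        "NAME": "lms",                      # placeholder database name
        "USER": "django_app",               # placeholder credentials
        "PASSWORD": "change-me",
        "HOST": "sqlserver.internal",       # placeholder host
        "PORT": "1433",
        "OPTIONS": {
            # Must match the ODBC driver name installed on the host.
            "driver": "ODBC Driver 17 for SQL Server",
        },
    }
}
```

The non-obvious part, which the limited documentation glossed over, is that the ODBC driver name goes in `OPTIONS` rather than in the connection fields Django uses for Postgres-style backends.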