Led a team of 3 developers building a nightly job that migrates school district data into our platform, removing the burden of manual data entry.
Our company had built a robust learning management system (LMS), but we had no way to sync with a school district’s existing system; their schools had to do it manually through Excel uploads or by filling out forms on the front end, which meant maintaining data in two places. For small schools this process was time-consuming and error-prone; for school districts it was a deal-breaker.
Our CEO asked me to build a system that would allow our platform to sync with any external LMS system. A discovery phase revealed that most school districts model their data after IMS standards and make their data accessible through CSV files or a REST API.
Schools were unable to use the LMS for 3 days or longer, depending on school size. The school day would be interrupted by manual-entry errors (e.g., a student registered in the wrong class), and resolving each error ate into the day of the student, teacher, counselor, and IT staff.
The solution: an ETL (Extract, Transform, Load) application that would connect over SFTP to a school district’s server, monitor the CSVs for changes, update our LMS nightly, and lay the foundation for eventual REST integration.
After researching SaaS products that could handle this process for us, I realized none of them offered an “end-to-end” solution capable of our complicated data massaging, so I needed a custom ETL tool.
I came up with an architecture that included a cron job to check for new CSV records nightly, migrate them to S3 for archival purposes, and trigger a Lambda to handle our complex data massaging and update the necessary databases across our APIs. I additionally designed the application to let IT users enter their SFTP and REST credentials and instantaneously sync their data with no manual involvement on our part.
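The core of that nightly flow can be sketched in a few functions. This is an illustrative sketch, not the production code: the change detection via file hashing, the IMS-style column names (`sourcedId`, `givenName`, `familyName`), and the output shape are all assumptions standing in for the real transform logic.

```python
import csv
import hashlib
import io

def file_fingerprint(data: bytes) -> str:
    """Hash a CSV's bytes so unchanged files can be skipped on the next nightly run."""
    return hashlib.sha256(data).hexdigest()

def detect_changes(csv_bytes: bytes, last_seen: dict, name: str) -> bool:
    """Return True if this file changed since the previous run, recording its new hash."""
    digest = file_fingerprint(csv_bytes)
    if last_seen.get(name) == digest:
        return False
    last_seen[name] = digest
    return True

def transform_rows(csv_bytes: bytes) -> list:
    """Massage raw district rows into the shape our LMS expects (illustrative)."""
    reader = csv.DictReader(io.StringIO(csv_bytes.decode("utf-8")))
    out = []
    for row in reader:
        out.append({
            "external_id": row["sourcedId"],  # IMS OneRoster-style identifier
            "name": f'{row["givenName"]} {row["familyName"]}'.strip(),
        })
    return out
```

The hash check is what keeps the job cheap: only files that actually changed get archived to S3 and pushed through the Lambda transform.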
After completing a prototype, I assigned the remaining tasks to my development team for completion.
I made sure to build transaction monitoring into the scope because I learned from previous ETL jobs the importance of knowing the source of data movement. This feature could later be built into a full-blown dashboard for school districts to monitor their data pipelines. The Sales team loved this idea because we could potentially charge more for it.
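Transaction monitoring at its simplest is an audit trail of every batch. A minimal sketch, with hypothetical names, of the kind of record-keeping that could later feed a district-facing dashboard:

```python
import datetime

class TransactionLog:
    """Minimal audit trail: every batch records where data came from and what happened."""

    def __init__(self):
        self.entries = []

    def record(self, source: str, table: str, inserted: int, updated: int, failed: int):
        self.entries.append({
            "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "source": source,  # e.g. the district SFTP path the batch came from
            "table": table,
            "inserted": inserted,
            "updated": updated,
            "failed": failed,
        })

    def summary(self, source: str) -> dict:
        """Roll up counts per source -- the shape a monitoring dashboard would read."""
        rows = [e for e in self.entries if e["source"] == source]
        return {
            "batches": len(rows),
            "inserted": sum(e["inserted"] for e in rows),
            "failed": sum(e["failed"] for e in rows),
        }
```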
Architected a microservice offering document storage to multiple apps, where the user decides how their data can be used.
Our company was building a document sharing application for healthcare and education providers to gather research from marginalized communities. I repeatedly heard from stakeholders that users in the community were apprehensive about sharing data: they didn’t want it shared with a third party without their consent, or, worse still, with government authorities, owing to poor experiences with law enforcement officials.
The providers were not getting “buy-in” from their users, so application use was minimal, or features that required users to share personal information were simply ignored.
I suggested we build an application that went beyond document sharing to include data transparency as the core of the data model to ultimately let the user know exactly how their data is being used. It could have further features to empower the user such as setting expiration dates and deciding permissions on the documents they share.
I made sure to emphasize that the data model and overall application architecture honor the permission levels set by the user and not have the front end decide what can be shown.
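The principle above reduces to a simple rule: the access decision lives in the service layer, next to the data, and the front end only renders what the server already allowed. A minimal sketch, with an assumed document shape (`owner`, `shared_with`, `expires_at`):

```python
import datetime

def can_view(document: dict, requester: str, now: datetime.datetime) -> bool:
    """Server-side check honoring the permissions the owner set.

    The front end never decides visibility; it only renders documents
    for which this check already returned True.
    """
    expires = document.get("expires_at")
    if expires is not None and now >= expires:
        return False  # owner-set expiration has passed for everyone
    if requester == document["owner"]:
        return True
    return requester in document.get("shared_with", [])
```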
Architected a .NET middleware application to automate processes for a health foundation, coalescing three different APIs – Survey Monkey, Dynamics CRM, and SharePoint.
Our company was midway through a CRM integration for one of the largest health foundations when they hit an impasse. One department was reluctant to adopt the CRM because they felt it wasn’t useful. The project’s success was contingent on every department using the CRM seeing as the ultimate goal was to provide data visibility across the organization.
If they don’t join, it’ll thwart the whole project’s effort to democratize our data. – Foundation CTO
The Advocacy department’s primary responsibility was surveying the foundation’s user base for outreach feedback. During discovery I learned their workflow had many manual steps that were ripe for automation, and that they were using third-party reporting and document-sharing tools to compensate for the lack of a unified system.
To replace their existing manual data collection methods, I built a custom middleware application that leveraged webhooks in their survey software (Survey Monkey) to monitor for submissions and trigger an automated processing workflow.
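That workflow can be sketched as a single dispatch function. The payload field (`object_id`) and the three step names are illustrative assumptions, not Survey Monkey's actual schema; the collaborators are injected so each integration step stays swappable and testable.

```python
def handle_survey_webhook(event: dict, fetch_response, update_crm, archive_doc) -> dict:
    """Run one webhook event through the middleware workflow.

    fetch_response -- pull the full submission from the survey API
    update_crm     -- upsert the contact and answers into Dynamics CRM
    archive_doc    -- store the rendered submission in SharePoint
    """
    response = fetch_response(event["object_id"])
    contact_id = update_crm(response)
    doc_url = archive_doc(response)
    return {"contact_id": contact_id, "document": doc_url}
```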
To handle high survey throughput we processed responses concurrently, a 5x speedup. I additionally built a custom dashboard for them to monitor survey activity and contact changes, providing full visibility and reporting on their user base.
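The concurrency itself is straightforward for I/O-bound work like API calls. A sketch of the pattern (names illustrative): with five workers, waiting on five responses overlaps instead of serializing, which is where a roughly 5x gain comes from.

```python
from concurrent.futures import ThreadPoolExecutor

def process_responses(responses, handle, workers: int = 5):
    """Process survey responses concurrently instead of one at a time.

    ``handle`` is the per-response work (API calls, CRM updates);
    results come back in input order.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handle, responses))
```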
I suggested we not build this middleware as a standalone service; I proposed a larger engagement to abstract it into a general sync layer for any external system, with the Advocacy department’s automation as the first of many. Our COO was delighted by this suggestion because it meant a larger proposal.
Some departments were hesitant to join the CRM due to data privacy; I had to carefully implement policies to ensure all contacts were shared but not all data was visible for each contact.
The challenge was ensuring we understood the Advocacy department’s goals and workflows so the CRM actually benefited them; otherwise their reluctance would thwart the CTO’s efforts to democratize the foundation’s data. I had to flex my consulting and onboarding skills to unify all departments around sharing data and using one system.
Served as Senior Software Engineer on an award-winning software-as-a-service (SaaS) content management system (CMS) that powers over 800 newspapers and media outlets worldwide. Consulted business leaders on best practices in platform integrations to expedite onboarding and optimize the readership experience.
Our company had a powerful CMS, but migrating media outlets from their previous systems proved daunting: some clients were abandoning their old system entirely and needed only a one-time ETL job, while others required a more complicated ongoing sync between systems. We originally built a custom integration for each client, but that proved too expensive.
We built a multi-tenant REST API that let newsrooms easily sync with the CMS, run and monitor ETL operations, and ensure all their news articles existed in both systems. The API would be customizable for each newsroom regardless of their data model, and scalable enough to handle 30+ newspaper companies migrating up to 100K records each – potentially 3 million records per minute in total.
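At those volumes the API's writes have to be idempotent: a re-run of the same migration batch must update in place, never duplicate. A minimal in-memory sketch of that tenant-scoped upsert (the storage and names are illustrative stand-ins for the real database):

```python
class ArticleStore:
    """Tenant-scoped, idempotent upserts: re-running a migration is safe."""

    def __init__(self):
        self._rows = {}

    def upsert(self, tenant: str, external_id: str, article: dict) -> str:
        """Key on (tenant, external id) so newsrooms never collide and a
        repeated batch overwrites in place instead of duplicating."""
        key = (tenant, external_id)
        created = key not in self._rows
        self._rows[key] = article
        return "created" if created else "updated"

    def count(self, tenant: str) -> int:
        return sum(1 for (t, _) in self._rows if t == tenant)
```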
Many newspaper outlets do not write all their own stories and publish articles from external providers via a process called “Wires”. These stories are typically available as feeds such as RSS, Apple, Facebook, etc. and our system needed to handle all of these as well as any potential ones in the future.
I collaborated on a scheduler application to retrieve the latest stories for external feeds and migrate them to the multi-tenant API mentioned above.
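One tick of that scheduler boils down to: fetch a feed, skip stories already seen, push the rest to the migration API. A sketch under assumptions (a stable `guid` per item, injected fetch/push callables so any feed format plugs in):

```python
def poll_feed(fetch_items, push_article, seen: set) -> int:
    """One scheduler tick for a single wire feed.

    fetch_items  -- returns the feed's current items (RSS, Apple, Facebook, ...)
                    as dicts with a stable ``guid``; parsing is format-specific.
    push_article -- sends a new story to the multi-tenant migration API.
    ``seen`` persists between ticks so each story is pushed exactly once.
    """
    pushed = 0
    for item in fetch_items():
        if item["guid"] in seen:
            continue
        push_article(item)
        seen.add(item["guid"])
        pushed += 1
    return pushed
```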
One newspaper relied heavily on distributing content through SMS but was having complications between third-party tools. Their text-messaging platform, Braze, was effective at dissemination but had a poor UI and no reporting. They had a reporting tool, SoFi, but it had no Braze integration.
The solution: build a user-friendly application that integrated into our existing suite of tools, handling all SMS selection and creation while leveraging the Braze API as the backend. Any metrics captured would be automatically routed to the reporting tool, creating a one-way binding between providers.
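The one-way binding is the essential shape: every send through the backend provider emits a metric to the reporting side, never the reverse. A sketch with both external calls injected; `braze_send` and `report_metric` are hypothetical stand-ins whose signatures are assumptions, not the real Braze or SoFi APIs.

```python
def send_campaign(recipients, body, braze_send, report_metric):
    """Send an SMS campaign via the backend provider, forwarding delivery
    metrics to the reporting tool (the one-way binding between providers)."""
    results = []
    for to in recipients:
        status = braze_send(to, body)  # delegate delivery to the SMS platform
        report_metric({"recipient": to, "status": status, "channel": "sms"})
        results.append(status)
    return results
```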