Scorecard
My goal: Develop digital fluency to significantly improve my employer's operational effectiveness, and to turn disparate, unstructured, and hard-to-find data into actionable intelligence.
Below is my self-assessment of my level of competency with various tools, and where I need to improve.
Level Guide
Level 1 — Awareness: Just getting started, concepts still fuzzy.
Level 2 — Basic Operator: Can use independently for routine/basic tasks (pre-junior level).
Level 3 — Applied Practitioner: Can audit, integrate with related tools, and troubleshoot simple issues (pre-junior level).
Level 4 — Junior Engineer: Can teach, build with the tool, and solve problems in organizational contexts.
Level 5 — Senior Engineer: Tool is second nature; can optimize, debug, innovate, and mentor others.
Python
I use Python for data cleaning, ETL pipelines, connecting to Google Drive and other data sources, and automating repetitive tasks. I also apply it in exploratory projects like building a CPU emulator and packaging my own utility libraries.
Inventory
Current Comfort Level
- Structuring Python projects for scalability and collaboration.
- Writing robust programs that are documented, handle errors, log events, and run independently of environment and location.
- Designing modules and using `__init__.py` for reusability.
- Performing basic ETL.
- Setting up and using virtual environments; installing, managing, and working with remote and editable packages via `pip`.
- Building and installing APIs.
- Performing basic quality control using `pytest` and `ruff`.
- Using Git/GitHub for version control and collaboration.
- Connecting to the Google Drive environment using Google's API.
- Safeguarding secrets with `.gitignore` and environment variables.
- Implementing robust error handling.
- Logging with `logging`.
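The last three points — secrets in environment variables, robust error handling, and logging — combine into one small pattern. A minimal sketch, assuming a hypothetical `DEMO_API_TOKEN` variable exported in the shell (or loaded from a `.env` file that is listed in `.gitignore`):

```python
import logging
import os

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def get_secret(name: str) -> str:
    """Read a secret from an environment variable, never from source code."""
    value = os.environ.get(name)
    if value is None:
        # Log the failure and stop, rather than running with a bad config.
        logger.error("Missing required environment variable: %s", name)
        raise KeyError(f"{name} is not set")
    return value


# Usage: token = get_secret("DEMO_API_TOKEN")
```

Failing loudly on a missing variable keeps a misconfigured deployment from silently running with empty credentials.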
Immediate Next Steps (1–2 months)
- Become fluent with key data engineering tools such as `numpy`, `pandas`, `pathlib`, `requests`, `selenium`, and `scrapy`.
- Gain competency with tooling and code-quality packages such as `sys`, `dis`, and `mypy`.
- Strengthen professional practices by structuring ETL as reusable functions and pipelines, and by integrating programs with terminal shells.
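Structuring ETL as reusable functions, as the last step describes, might look like the following sketch — standard library only, with hypothetical extract/transform/load stages so each one stays independently testable:

```python
import csv
import io
from typing import Iterable

Row = dict[str, str]


def extract(source: io.TextIOBase) -> Iterable[Row]:
    """Read rows from any CSV file-like object."""
    yield from csv.DictReader(source)


def transform(rows: Iterable[Row]) -> Iterable[Row]:
    """Normalize one field; a real pipeline would chain more steps."""
    for row in rows:
        row["name"] = row["name"].strip().title()
        yield row


def load(rows: Iterable[Row]) -> list[Row]:
    """Materialize the result; a real loader might write to SQLite."""
    return list(rows)


def run_pipeline(source: io.TextIOBase) -> list[Row]:
    """Compose the stages into one pipeline."""
    return load(transform(extract(source)))


# Usage with an in-memory CSV standing in for a real file:
data = io.StringIO("name,qty\n  alice ,3\nBOB,5\n")
result = run_pipeline(data)
```

Because each stage takes and returns plain iterables, stages can be unit-tested, swapped, or reused across pipelines without touching the others.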
InfoSec
I implement best infosec practices in everything I do — from securing credentials and systems to protecting the privacy of the people whose data I handle.
Inventory
Current Comfort Level
- Applying core infosec principles across systems and data workflows.
- Managing and tracking IAM, assigning Role-Based Access Controls, and enforcing the Principle of Least Privilege.
- Mapping and documenting data flows throughout the data lifecycle.
- Using data services that provide encryption in transit and at rest.
- Maintaining awareness of privacy-by-design and data minimization practices.
Next Steps
- Establish expertise to educate and collaborate with teams on secure data and device handling.
- Learn and practice tools for auditing and testing vulnerabilities (e.g., OWASP ZAP, Nessus, Azure Security Center).
- Develop confidence in documenting and reporting security posture and incidents.
- Implement logging and monitoring to support audit trails and anomaly detection.
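The last point — logging to support audit trails — can be sketched with the standard `logging` module. The event fields and names here are hypothetical, not a prescribed format:

```python
import logging

# A dedicated logger keeps security events separate from application logs.
audit = logging.getLogger("audit")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s AUDIT %(message)s"))
audit.addHandler(handler)
audit.setLevel(logging.INFO)


def log_access(user: str, resource: str, action: str) -> None:
    """Record who did what to which resource, for later review."""
    audit.info("user=%s action=%s resource=%s", user, action, resource)


log_access("jdoe", "reports/q3.csv", "read")
```

In practice the handler would write to an append-only destination so the trail itself cannot be quietly edited; structured key=value messages make the log greppable for anomaly detection.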
Git/GitHub
I use Git/GitHub to version-control my projects, manage branches, share my work, and collaborate (with just myself currently). It allows me to track changes, experiment safely, and document my learning journey across repositories.
Inventory
Current Comfort Level
- Understanding how teams use Git and GitHub to keep codebases compatible and workflows aligned.
- Understanding how open source projects use Git and GitHub to enable broad contribution while maintaining project integrity and stability.
- Maximizing clarity and efficiency by following standard naming conventions for branches, commits, and repositories.
- Recognizing Git as a system of record, where every change is traceable.
- Creating deliberate version history and protecting work with consistent use of `add`, `commit`, and `push`.
- Connecting a local machine to GitHub using SSH.
- Creating repositories from the terminal using `gh`.
- Creating and switching between branches with confidence.
- Reviewing branches and commits using `log` and `show`.
- Merging branches using `merge` and `rebase`.
- Undoing changes via `restore`.
- Managing pull requests and merges via the terminal (individual and collaborative projects).
- Diagnosing and fixing merge/rebase conflicts.
- Protecting `origin/main` from unauthorized pushes through branch protections.
Immediate Next Steps (1–2 months)
- Improve comfort with resolving merge conflicts independently.
- Practice advanced history rewriting with `rebase -i` (interactive rebase).
PowerShell
I use PowerShell to navigate and operate on directories and files, and to write bootstrap programs to get projects off the ground quickly and uniformly.
Inventory
Current Comfort Level
- Navigating directories.
- Running commands with conditional logic and `ForEach-Object` loops.
- Building `.ps1` scripts to automate and standardize project setup.
Immediate Next Steps (1–2 months)
- Strengthen fluency with core PowerShell commands and syntax to navigate, query, and manage directories and files more efficiently in terminal.
Databases
I use SQLite to persist data and test lightweight pipelines, and I am expanding toward more advanced relational database work.
Inventory
Current Comfort Level
- Understanding of core database design principles (e.g., tables, relationships, normalization, granularity, primary and foreign keys).
- Working knowledge of core SQL syntax.
- Practical experience using `sqlite3` in Python projects for structured data storage and retrieval.
Immediate Next Steps (1–2 months)
- Build fluency by using SQL in everyday work.
- Practice designing small schemas to support ETL pipelines.
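A small schema of the kind the steps above describe — primary and foreign keys in `sqlite3`, using an in-memory database and hypothetical table names. Note that SQLite leaves foreign-key enforcement off unless the pragma is set:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # off by default in SQLite

conn.executescript("""
    CREATE TABLE sources (
        id   INTEGER PRIMARY KEY,
        name TEXT NOT NULL UNIQUE
    );
    CREATE TABLE records (
        id        INTEGER PRIMARY KEY,
        source_id INTEGER NOT NULL REFERENCES sources(id),
        payload   TEXT NOT NULL
    );
""")

conn.execute("INSERT INTO sources (name) VALUES (?)", ("drive_export",))
conn.execute(
    "INSERT INTO records (source_id, payload) VALUES (?, ?)",
    (1, "row-1"),
)
count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
```

Splitting `sources` from `records` is the one-to-many pattern most ETL landing schemas need: each loaded row points back at exactly one registered source.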
Google Cloud Platform
I use the Google Cloud console to turn my IDE and Python into a working terminal with access to our team's data in the cloud.
Inventory
Current Comfort Level
- Understanding of core cloud concepts such as availability and redundancy.
- Navigating GCP architecture and creating Organizations, Projects and Services.
- Configuring billing and safeguards to stay within budget.
- Enabling APIs and creating OAuth and Service Account credentials.
Where I'm headed
- Build fluency with Cloud Storage, Cloud SQL, and BigQuery to conduct large-scale data projects.