Project Context and Environment

In a complex digital landscape, user trust and data security are critical to building responsible and competitive products. Our organization had launched a Safety by Design program to embed privacy and security principles into every phase of the product engineering lifecycle — from concept through deployment.

However, there was no formal method to measure the program’s effectiveness, particularly in terms of application security practices. At the same time, our development teams were expanding rapidly, and existing security automation processes needed to evolve to keep pace without adding friction to the user experience or overburdening developers.

Leadership needed answers:

  • Was the program truly working?

  • Where were the security blind spots?

  • Were engineers following secure coding practices consistently and measurably?

My Role as the Project Manager & Data Analyst

I was responsible for leading, developing, and executing a high-impact initiative to:

  • Evaluate the performance of the Safety by Design program using real engineering and application data.

  • Build and deploy analytics dashboards tracking 24 security-focused performance metrics across software engineering teams.

  • Identify actionable insights to enhance security automation, reduce human intervention, and improve developer engagement.

  • Ensure findings are translated into practical improvements that elevate both security posture and user trust.

  • Serve as the liaison between technical and non-technical teams, supporting decision-making and solution design across the organization.

Execution Strategy

1. Stakeholder Alignment & Goal Definition

  • Collaborated with application security leads, engineering managers, and compliance teams to define key performance indicators (KPIs) tied to secure development practices and risk mitigation.

  • Aligned the project scope with broader organizational goals around DevSecOps maturity and privacy-by-default principles.

2. Metric Selection & Dashboard Development

  • Worked with data engineers to define 24 critical data points, including:

    • Frequency and coverage of static/dynamic code scans

    • Time to resolve security issues

    • Secure coding training adoption rates

    • Pull request security review compliance

    • Risk acceptance trends and exceptions

  • Led the design and implementation of real-time dashboards, ensuring they were accessible to both executive stakeholders and engineering teams.
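As an illustration of how such a metric catalog can be modeled, the sketch below defines a small, typed registry of security KPIs. All metric names and fields here are hypothetical examples, not the actual 24 data points tracked in the program:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityMetric:
    """One tracked security KPI (names are illustrative, not the real set)."""
    key: str              # stable identifier the dashboard queries by
    description: str      # what the metric measures
    unit: str             # e.g. "%", "hours", "count"
    higher_is_better: bool  # direction used when color-coding trends

# A small illustrative subset of a KPI catalog like the one described above.
METRICS = [
    SecurityMetric("sast_scan_coverage", "Share of repos with static scans enabled", "%", True),
    SecurityMetric("dast_scan_frequency", "Dynamic scans run per release", "count", True),
    SecurityMetric("time_to_resolve_high", "Time to resolve high-severity issues", "hours", False),
    SecurityMetric("training_adoption", "Secure coding training completion rate", "%", True),
    SecurityMetric("pr_review_compliance", "PRs passing security review gates", "%", True),
    SecurityMetric("risk_exceptions_open", "Open risk acceptance exceptions", "count", False),
]

def metric_keys():
    """Keys the dashboard backend exposes for querying."""
    return [m.key for m in METRICS]
```

Keeping the catalog as data rather than hard-coding each panel makes it cheap to add or retire a metric without touching dashboard code.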

3. Data Collection & Performance Analysis

  • Integrated analytics into the CI/CD pipeline and internal developer tools to passively collect relevant data without disrupting workflows.

  • Performed trend analysis and benchmarking across product teams to identify:

    • Automation blind spots

    • Underperforming teams or security bottlenecks

    • Opportunities for early security intervention before deployment
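The benchmarking step above can be sketched in a few lines. This is a minimal illustration, assuming resolved-issue records carry a team name and open/resolve timestamps (all field names are hypothetical):

```python
from collections import defaultdict
from statistics import median

def benchmark_resolution_times(issues):
    """Group resolved issues by team and return each team's median
    time-to-resolve in hours. `issues` is a list of dicts with
    illustrative fields: {"team": str, "opened_h": float, "resolved_h": float}.
    """
    by_team = defaultdict(list)
    for issue in issues:
        by_team[issue["team"]].append(issue["resolved_h"] - issue["opened_h"])
    return {team: median(times) for team, times in by_team.items()}

def slowest_teams(medians, threshold_h):
    """Flag teams whose median exceeds a threshold: candidate bottlenecks."""
    return sorted(team for team, m in medians.items() if m > threshold_h)

# Usage with toy data:
issues = [
    {"team": "payments", "opened_h": 0,  "resolved_h": 72},
    {"team": "payments", "opened_h": 10, "resolved_h": 40},
    {"team": "identity", "opened_h": 0,  "resolved_h": 12},
]
medians = benchmark_resolution_times(issues)   # {"payments": 51.0, "identity": 12}
flagged = slowest_teams(medians, threshold_h=24)  # ["payments"]
```

Medians are used rather than means so a single long-lived outlier issue does not mask a team's typical turnaround.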

4. Recommendations & Automation Enhancements

  • Presented insights to the cross-functional security committee to inform:

    • Prioritization of automation tooling upgrades

    • Training enhancements based on team-specific gaps

    • Refinement of the Safety by Design policy with clearer thresholds and accountability models

  • Partnered with security automation engineers to refine rule sets and reduce false positives in scanning tools.
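One common shape for that false-positive work is a reviewed suppression list that triage applies before findings reach developers. The sketch below is a simplified illustration (rule IDs and paths are invented, not our actual scanner rules):

```python
from fnmatch import fnmatch

# Hypothetical suppression rules: (rule_id, path_glob) pairs that engineers
# confirmed as false positives during triage review.
SUPPRESSIONS = [
    ("hardcoded-secret", "tests/*"),            # test fixtures, not real credentials
    ("sql-injection", "scripts/migrations/*"),  # parameterized by the migration tool
]

def is_suppressed(finding, suppressions=SUPPRESSIONS):
    """True if a scanner finding matches a known false-positive rule."""
    return any(
        finding["rule_id"] == rule_id and fnmatch(finding["path"], pattern)
        for rule_id, pattern in suppressions
    )

def triage(findings, suppressions=SUPPRESSIONS):
    """Split raw scanner output into actionable findings and suppressed noise."""
    actionable = [f for f in findings if not is_suppressed(f, suppressions)]
    noise = [f for f in findings if is_suppressed(f, suppressions)]
    return actionable, noise

# Usage with toy findings:
findings = [
    {"rule_id": "hardcoded-secret", "path": "tests/fixtures/creds.py"},
    {"rule_id": "hardcoded-secret", "path": "src/app/config.py"},
]
actionable, noise = triage(findings)
```

Keeping suppressions in a reviewed list, rather than disabling rules globally, preserves coverage while cutting the noise that erodes developer trust in the tooling.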

Results and Impact

Security Visibility and Governance

  • Delivered live dashboards used by 5+ engineering teams and executive security leads to monitor security KPIs in real time.

  • Enabled data-backed policy decisions that improved oversight and reduced manual audits.

Performance & Automation Gains

  • Identified and closed 100+ security coverage gaps in pre-production environments.

  • Reduced average time to remediate high-risk vulnerabilities by 42%.

  • Increased automation coverage in secure code review processes by 45%.

Culture & Program Maturity

  • Increased developer participation in secure coding practices by 60%, based on training completion and tool adoption rates.

  • Set a new internal benchmark for measuring application security effectiveness in product development.

Project Statement


This project demonstrated how thoughtful use of engineering metrics and automation can turn security from a reactive burden into a proactive, value-driving capability. By grounding the Safety by Design program in real, measurable performance data, we strengthened our security defenses, improved developer productivity, and built a stronger foundation of digital trust for our users and stakeholders.
I pay special attention to this project because it allowed me to wear different “hats” in the same effort: leading the project while leveraging my data analytics skills to mitigate budget and resource risks, especially when teams are working at full capacity on a limited budget. Despite significant constraints, I created measurable improvements in security culture, developer engagement, and automation reliability, proving my ability to deliver results in complex environments.
