Job Details
Minimum qualifications:
- Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
- 8 years of experience with security assessments, security design reviews, or threat modeling.
- 3 years of experience in a technical leadership role overseeing projects, including 2 years of experience in a people management, supervision, or team leadership role.
- Experience managing a team of engineering or security operations professionals for an organization.
- Experience as a technical security professional prior to management.
Preferred qualifications:
- Experience with AI systems and testing techniques for AI-powered products.
- Experience with writing Python code in production environments.
- Experience managing bug bounties.
- Experience working on sensitive escalations that include external parties.
- Understanding of AI principles and best practices.
- Excellent problem-solving and critical thinking skills with attention to detail in a fluid environment.
There's no such thing as a "safe system" - only safer systems. Our Security team works to create and maintain the safest operating environment for Google's users and developers. As a Security Engineer, you help protect network boundaries, keep computer systems and network devices hardened against attacks, and provide security services to protect highly sensitive data like passwords and customer information. Security Engineers work directly with network equipment and actively monitor our systems for attacks and intrusions. You also work with software engineers to proactively identify and fix security flaws and vulnerabilities.
You are a recognized expert in at least two security domains and use your leadership skills to manage a team that sets the direction and goals for solving Google-wide problems. You identify fundamental security problems at Google and drive major security improvements in Google infrastructure.
In this role, you will drive operational excellence within the Vulnerability Rewards Program (VRP) team, ensuring efficient processes, clear communication, and effective collaboration across Trust and Safety. You will be instrumental in navigating the evolving landscape of generative Artificial Intelligence (AI) vulnerabilities, developing strategies and solutions to mitigate risks and ensure the responsible development and deployment of Generative AI (GenAI) technologies.
The US base salary range for this full-time position is $189,000-$284,000 + bonus + equity + benefits. Our salary ranges are determined by role, level, and location. The range displayed on each job posting reflects the minimum and maximum target salaries for the position across all US locations. Within the range, individual pay is determined by work location and additional factors, including job-related skills, experience, and relevant education or training. Your recruiter can share more about the specific salary range for your preferred location during the hiring process.
Responsibilities:
- Provide guidance and technical direction to a team of technical Individual Contributors (ICs) responsible for supporting both the Abuse VRP and the GenAI VRP.
- Design, implement, and manage a new comprehensive GenAI VRP program to incentivize and reward external security researchers for identifying and responsibly disclosing vulnerabilities and potential abuse scenarios.
- Develop and maintain clear program guidelines, eligibility criteria, and reward structures that align with industry best practices and legal requirements. Develop and implement automation and tooling to streamline repetitive tasks.
- Engage with various teams within Trust and Safety, Product, and the VRP community to align priorities and establish a shared understanding of the role of Vulnerability Rewards Programs in mitigating abuse and security risks.
- Work with sensitive content or situations and may be exposed to graphic, controversial, or upsetting topics or content.
Build for everyone: Since our founding in 1998, Google has grown by leaps and bounds.