About Netskope
Today, there are more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one that is built in the cloud and follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network and Data Security.
Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Melbourne, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events (pre and hopefully post-Covid) and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.
About the role
Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience.
The Data QE team is responsible for the quality of data, data services, and data components across our cloud and hybrid cloud environments. We develop tools, create fully automated regression suites and conduct performance tests for distributed data components at cloud scale. If you thrive on solving difficult problems, complex test scenarios, and developing high-performance QE tooling and automation, we would love to discuss our career opportunities with you.
What's in it for you
You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics
Your contributions will have a major impact on our global customer base and across the industry through our market-leading products
You will solve complex, interesting challenges and improve the depth and breadth of your technical and business skills
What you will be doing
You will be writing tests and automation for our big data workflows
You will help to validate ingestion pipelines and query platforms while working closely with the Data Engineers to optimize and test cloud-scale data infrastructure and platforms
Required skills and experience
8+ years of experience
Solid test automation experience in Python, Go, Java, or any other language
Good understanding of data structures and algorithms and excellent programming skills
Experience in API testing and testing microservice-based architectures
Expertise in test plan and test case documentation
Good understanding of data ingestion pipelines and data querying services
Experience with Docker or container management platforms like Kubernetes
Working knowledge of SQL and/or NoSQL datastores like Elasticsearch, MongoDB, Postgres, BigQuery
Good understanding of DB internals
Strong verbal and written communication skills
Bachelor's / Master's degree in Computer Science or a related field
Education
BSCS or equivalent required, MSCS or equivalent strongly preferred
#LI-SC3
Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.
Netskope respects your privacy and is committed to protecting the personal information you share with us. Please refer to Netskope's Privacy Policy for more details.