
  • Posted: Dec 13, 2023
    Deadline: Not specified

    We deliver open source to the world faster, more securely and more cost-effectively than any other company. If you're interested in a career at Canonical, note that we are a remote-first company, so please apply to any suitable role: skills are valued more than location, although some roles have a geographic preference.

    Software Engineer - Data Infrastructure - Kafka

    • We are looking for candidates from junior to senior level with interest in, experience with, and a willingness to learn Big Data technologies such as distributed event stores (Kafka) and parallel computing frameworks (Spark). Engineers who thrive at Canonical are mindful of open-source community dynamics and equally aware of the needs of large, innovative organisations.

    Location: This is a globally remote role.
    What your day will look like
    The data platform team is responsible for the automation of data platform operations, with the mission of managing and integrating Big Data platforms at scale. This includes ensuring fault-tolerant replication, TLS, installation, backups and much more; the team also provides domain-specific expertise on these data systems to other teams within Canonical. This role is focused on creating and automating the infrastructure features of data platforms, not on analysing or processing the data in them.

    • Collaborate proactively with a distributed team
    • Write high-quality, idiomatic Python code to create new features
    • Debug issues and interact with upstream communities publicly
    • Work with helpful and talented engineers including experts in many fields
    • Discuss ideas and collaborate on finding good solutions
    • Work from home with global travel for 2 to 4 weeks per year for internal and external events

    What we are looking for in you

    • Proven hands-on experience in software development using Python
    • Proven hands-on experience in distributed systems, such as Kafka and Spark
    • A Bachelor's degree or equivalent in Computer Science, STEM, or a similar field
    • Willingness to travel up to 4 times a year for internal events

    Additional skills that you might also bring
    You might also bring a subset of the following experience, which can help the Data Platform team achieve its challenging goals and will determine the level at which we consider you:

    • Experience operating and managing other data platform technologies, SQL (MySQL, PostgreSQL, Oracle, etc.) and/or NoSQL (MongoDB, Redis, Elasticsearch, etc.), at DBA-level expertise
    • Experience with Linux systems administration, package management, and infrastructure operations
    • Experience with the public cloud or a private cloud solution like OpenStack
    • Experience with operating Kubernetes clusters and a belief that it can be used for serious persistent data services

    Method of Application

    Interested and qualified? Go to Canonical on boards.greenhouse.io to apply
