Sunday, October 12, 2014

[Atlassian] Data Scientist


Data Scientist

Location: Austin
Job Code: QHP-1104
# of openings: 1

Description

Atlassian is looking for a Data Engineer to join our Growth team and manage our marketing data pipeline that powers crucial business decisions throughout the organization.

  • Do you dream about Data and talk to your friends in SQL?
  • Do you want to inspire and be inspired by working with the best and brightest?

Position Summary

You’re friendly, positive, professional, and fun to work with!

You’re a creative thinker with excellent problem-solving and decision-making ability. You’re proactive, self-starting, organized, and willing to take on difficult problems.

You have excellent communication skills, both written and verbal. You’re self-motivated, energetic, and passionate.

You’ll be the genius who understands data at Atlassian, knows where to find it, and manages the process to make that data useful for Marketing Analytics. You love thinking about the ways the business can consume this data and then figuring out how to build it. On a typical day you may be consulted on the information architecture of our website, help design the event collection infrastructure, and then move on to building the data models and ETL processes that provide this data for business use. You've got practical experience working with large datasets, and you're interested in reporting platforms and data visualization.

You may be interested in machine learning, statistics, or one of several similar fields, but the most important factor is that you have a strong foundation in coding and building data pipelines. As part of the Atlassian Growth team, you'll own a problem end-to-end, so those skills will come in handy not just to collect, extract, and clean the data, but also to understand the systems that generated it and automate your analyses and reporting.

On an on-going basis, you'll be responsible for improving the data by adding new sources, coding business rules and producing new metrics that support the business.

As a data engineer, you have experience spanning traditional DW and ETL architectures as well as newer big data ecosystems like Hadoop/EMR and Redshift. You've probably been in the industry as an engineer and have developed a passion for the data that drives businesses. This role is all about helping to guide the business, so in addition to facility with data and programming skills, you'll need to write clear, professional documents accessible to both technical staff and business leaders. You'll need to be technical at heart, comfortable with the idea of extending systems by writing code rather than just relying on built-in functionality.

Who we are looking for:

  • You can build and maintain transformation scripts for different types of data.
  • You're an expert in one or more scripting languages (JavaScript, Python, Shell, etc.) that you use to supplement system capabilities.
  • You have a deep understanding of SQL, SQL tuning, and schema design.
  • You have extensive experience with data quality and data-manipulation (ETL) tools that convert data into actionable information.
  • You have experience with Hadoop/MapReduce/AWS/EMR/Redshift for large data processing
  • You have experience in logical and physical data modelling
  • You have an excellent understanding of scheduling and workflow frameworks and principles.
  • You're comfortable with Linux command-line tools and basic shell scripting.
  • You're comfortable with Agile software development methodologies.
  • You have excellent communication skills.
  • You're flexible and don't let system limitations stop you from getting things done.
  • You can empathize with business users and anticipate how they'd like to use information.
  • You're the type of person who publishes good documentation to help people use the information you provide.
  • You pay meticulous attention to end-to-end data quality, validation, and consistency
  • You have a graduate degree in Computer Science or similar discipline

Preferred Skills:

  • You've used version control systems like Git.
  • You have experience retrieving data from remote systems via API calls (e.g., REST).
  • You have experience with test automation and continuous build
  • You have an understanding of data mining and machine learning

A few more things to know:

How to apply

Please submit your resume and cover letter. Your cover letter should address the points listed above ("Who we are looking for").




