Ad type: Job offer
Contract type: Fixed-term
Academic qualifications: Bachelor's degree (Licenciatura)
Professional experience: 4-7 years
Currency: AKZ
Payment: Monthly

General Duties & Responsibilities
Working across a number of business areas to provide development, maintenance, and support
Working as part of a team, with occasional solo development as business needs arise
Building, deploying, and maintaining mission-critical analytics solutions that process data quickly at big-data scale
Contributing design, code, configuration, and documentation for components that manage data ingestion, real-time streaming, batch processing, and data extraction, transformation, and loading across multiple game franchises.
Cross-training other team members on the technologies being developed, while continuously learning new technologies from them.
Interacting with engineering teams to ensure that solutions meet customer requirements for functionality, performance, availability, scalability, and reliability.
Working directly with business analysts and data scientists to understand and support their use cases
Performing development, QA, and DevOps roles as needed to take end-to-end responsibility for solutions.
Designing and maintaining data systems and databases; this includes fixing coding errors and other data-related problems.
Mining data from primary and secondary sources, then reorganizing that data into a format that can be easily read by humans or machines.
Using statistical tools to interpret data sets, paying particular attention to trends and patterns that could be valuable for diagnostic and predictive analytics efforts.
Demonstrating the significance of this work in the context of local, national, and global trends that impact both the organization and its industry.
Preparing dashboards as self-service BI for executive leadership that effectively communicate trends, patterns, and predictions using relevant data.
Collaborating with programmers, engineers, and organizational leaders to identify opportunities for process improvements, recommend system modifications, and develop policies for data governance.
Creating appropriate documentation that allows stakeholders to understand the steps of the data-analysis process and replicate the analysis if necessary.
Selecting appropriate datasets and data-representation methods
Keeping abreast of developments in the field
Helping identify probable causes and providing immediate solutions during incidents
Working within an agile environment, following an agile framework.
Contributing significant ideas for making the applications better and easier to use
Participating in cutting-edge research in artificial intelligence and machine-learning applications.
Contributing to engineering efforts, from planning and organization through execution and delivery, to solve complex, real-world engineering problems.
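The statistical duties above (interpreting data sets for trends and patterns) can be sketched in a few lines of Python; the monthly figures below are hypothetical sample data, not real metrics:

```python
from statistics import mean

# Hypothetical monthly active-user counts (invented sample data)
months = list(range(1, 9))
users = [120, 132, 129, 141, 150, 149, 158, 166]

def trend_slope(xs, ys):
    """Least-squares slope: the average change in y per unit of x."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

slope = trend_slope(months, users)
print(f"Average growth: {slope:.2f} users/month")
```

A positive slope flags an upward trend worth feeding into the predictive-analytics work described above.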
Identifying data-storage requirements. ETL developers determine the company's storage needs; they need a bird's-eye view of the data landscape to choose the best-suited option.
Building a data warehouse. Once the requirements are clear, ETL developers build a data warehouse tailored to the organization's needs.
Building reliable data pipelines: the set of tools and processes that bring data to the user. Pipelines connect systems and transfer data from one format to another.
Running ETL processes. Once the data warehouse is complete, the ETL developer extracts data from source systems and delivers it to the new system.
Debugging. ETL developers rectify any problems with the warehousing system.
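The extract-transform-load cycle described above can be illustrated with a minimal, self-contained sketch: rows are extracted from a CSV source, transformed (typed and enriched), and loaded into a warehouse-style table. The in-memory SQLite database and the table and field names are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: a hypothetical CSV feed standing in for a real source system
raw = io.StringIO("player_id,score\n1,250\n2,310\n3,180\n")
rows = list(csv.DictReader(raw))

# Transform: cast string fields to integers and derive a flag column
records = [(int(r["player_id"]), int(r["score"]), int(r["score"]) >= 200)
           for r in rows]

# Load: write the transformed records into a warehouse-style table
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE scores (player_id INTEGER, score INTEGER, high_scorer BOOLEAN)")
db.executemany("INSERT INTO scores VALUES (?, ?, ?)", records)

high = db.execute("SELECT COUNT(*) FROM scores WHERE high_scorer").fetchone()[0]
print(f"{high} of {len(records)} players are high scorers")
```

In production the same three stages would run against real source systems and a warehouse such as BigQuery or Snowflake, orchestrated and scheduled rather than run inline.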
Skills and Experience
Preferred skills in SQL, NoSQL, Pig, MATLAB, SAS, Java, Ruby, C++, Perl, and APIs for working with the available data sources
Experience evaluating and improving all aspects of the existing ETL system and building new ETL pipelines on big-data infrastructure.
Quality assurance. After the warehouse is up and running, ETL developers run tests to ensure its stability.
Experience with big-data tools and architectures, such as Cloudera Hadoop, HDFS, Hive, BigQuery, Snowflake, and Spark.
Experience with the Python programming language and frameworks such as Flask and AIO
Understanding of data structures, data modelling, and software architecture
Advanced knowledge of SQL queries
Working knowledge of telematics interfaces and streaming solutions (MQTT, NiFi, Kafka, etc.).
Experience with cloud platforms: Google Cloud Platform is a must; AWS would be a plus
Outstanding analytical and problem-solving skills
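The "advanced knowledge of SQL queries" asked for above typically includes window functions, which rank or aggregate rows within partitions. A small sketch, run here against an in-memory SQLite database with an invented per-franchise revenue table:

```python
import sqlite3

# Hypothetical daily revenue per game franchise (names and figures invented)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE revenue (franchise TEXT, day INTEGER, amount REAL)")
db.executemany("INSERT INTO revenue VALUES (?, ?, ?)", [
    ("alpha", 1, 100.0), ("alpha", 2, 150.0),
    ("beta", 1, 80.0), ("beta", 2, 60.0),
])

# RANK() OVER a partition finds each franchise's best day without self-joins
query = """
SELECT franchise, day, amount,
       RANK() OVER (PARTITION BY franchise ORDER BY amount DESC) AS rnk
FROM revenue
"""
top_days = {f: d for f, d, a, r in db.execute(query) if r == 1}
print(top_days)
```

Window functions require SQLite 3.25 or later; the same query runs unchanged on BigQuery or Snowflake, the warehouses named above.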
Job Qualifications
Bachelor's degree in Computer Science, or related technical field, or equivalent work experience.
3 - 5 years of relevant work experience.
Experience designing and implementing distributed software systems (e.g., in Python).
Good oral and written English communication skills
Strong grasp of established and emerging technologies, systems, platforms, and software
Ability to organize and manage multiple priorities
Technical curiosity - Willingness to explore and learn new technologies that are unfamiliar
Ability to work in a fast-paced, delivery-oriented environment
Ability to deliver short-term results while investing in long-term strategic solutions
Self-starter: self-motivated and able to learn independently
Team player who is eager to help others to succeed