Apr 30, 2023 - Codibly is hiring a remote Data Engineer with Python. 📍 Location: Poland.
Codibly is a technology & consulting company focused on custom software and product development. We are a community of thinkers and innovators: people who want to code the difference and constantly develop their skills. Codibly's industry expertise concentrates mainly on the energy & utility sector, but we have also collaborated with companies from fintech, healthcare, HR, and other industries.
Be here.
Be charged up.
YOU WILL BE RESPONSIBLE FOR:
- Developing dedicated software solutions for the renewable energy sector.
- Discovering, designing, and implementing technical solutions.
- Implementing machine learning algorithms.
- Designing and creating big data solutions.
- Implementing new features, from requirements through production deployment.
- Working as a member of a Scrum team and exchanging ideas with your colleagues on a daily basis.
- Collaborating closely with other development teams, project stakeholders, and clients.
- Providing technical leadership for the development team and mentoring junior colleagues.
- Designing and writing technical documentation.
IN RETURN YOU CAN EXPECT:
- A non-corporate atmosphere full of openness to innovation, improvement, teamwork, and a data-driven approach.
- Remote or hybrid work model.
- Private health care package.
- Training budget.
- Paid time off (24 days per year).
- Laptop.
- No dress code.
- Cafeteria-style benefits plan.
- eTutor platform.
WHAT WE EXPECT:
- At least 5 years of commercial experience in software development and/or deployment projects, ideally in cloud or network environments.
- Advanced knowledge of Python and its libraries, such as TensorFlow, Keras, or PyTorch.
- Experience working with the Snowflake data warehouse.
- Experience in designing and creating big data solutions.
- Knowledge of machine learning and deep learning techniques.
- Knowledge of the Azure cloud.
- Experience with tools for processing and analyzing big data sets, such as Apache Spark (a short sketch follows this list).
- Knowledge of data visualization tools.
- Knowledge of design patterns and programming principles.
- Knowledge of the DevOps area and tools such as Ansible, Terraform, Kubernetes, or Docker (nice to have).
- Experience with Agile/Scrum practices.
- Good English communication skills (min. B2), both verbal and written.
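To make the Apache Spark requirement above concrete, here is a minimal, hypothetical PySpark sketch of the kind of batch processing the role describes. The file name, column names, and aggregation are illustrative assumptions, not part of Codibly's actual codebase:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Local session for experimentation; a real deployment would target a cluster.
spark = SparkSession.builder.appName("meter-readings-demo").getOrCreate()

# Hypothetical smart-meter readings with columns: meter_id, ts (timestamp), kwh.
readings = spark.read.csv("readings.csv", header=True, inferSchema=True)

# Aggregate raw readings into daily consumption per meter.
daily = (
    readings
    .withColumn("day", F.to_date("ts"))
    .groupBy("meter_id", "day")
    .agg(F.sum("kwh").alias("kwh_total"))
    .orderBy("meter_id", "day")
)

daily.show()
spark.stop()
```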
TECH STACK:
- Python
- Snowflake (see the connection sketch after this list)
- SQL
- Git
- NumPy, Pandas
- Azure
- Kubernetes
- Docker
- Airflow (nice to have)
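For a sense of how Python, Snowflake, and Pandas from this stack typically fit together, here is a minimal sketch using the snowflake-connector-python package with its pandas extra. All connection parameters and the table name are hypothetical placeholders, not details from this posting:

```python
import snowflake.connector  # pip install "snowflake-connector-python[pandas]"

# All connection parameters below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="...",            # in practice, load from a secret store
    warehouse="ANALYTICS_WH",
    database="ENERGY",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    # "daily_consumption" is a hypothetical table used only for illustration.
    cur.execute("SELECT meter_id, day, kwh_total FROM daily_consumption LIMIT 10")
    df = cur.fetch_pandas_all()  # returns the result set as a pandas DataFrame
    print(df.head())
finally:
    cur.close()
    conn.close()
```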