Be a key driver on BASF’s path to digitalisation by supporting existing products and initiatives as well as innovating additional digital solutions that support BASF’s global businesses
What you can expect
Our unit “Data Enablement - Big Data Solutions” uses highly innovative technologies to develop advanced analytics prototypes, builds and operates big data storage solutions and analytics platforms for global deployment, and organizes an overarching data lake, the BASF Enterprise Data Lake. Tasks:
- Secure operation and support, with end-to-end responsibility, of our BASF Enterprise Data Lake platform
- Develop, run and automate CI/CD pipelines – Infrastructure as code
- Monitoring of cloud PaaS and IaaS components – Availability, Performance, Security
- Incident/Change/Problem Management – Continual Service Improvement
- Root cause identification and problem resolution
- Ensure adherence to BASF compliance and security guidelines on our platform
- Patch management of infrastructure
- Creation and implementation of a backup and recovery concept
- Manage and monitor third-party tickets for supplier support
- Support and enable use cases on the Enterprise Data Lake (EDL) platform
- Implementation of authorization requests
What we expect
- A Bachelor’s or Master’s degree in a relevant business/IT field with at least 5 years of experience in a similar role
- Business and technical consulting skills
- A flexible, agile way of working (Scrum knowledge appreciated) and a DevOps mindset
- An entrepreneurial spirit and the ability to foster a positive and energized culture
- A growth mindset with a curiosity to learn and improve
- Experience with the administration and operation of big data technologies
- At least 2 years of experience with basic IT technologies such as operating systems, databases, web servers, HTTP communication, REST APIs and networking
- Experience with Big Data technologies such as Spark, Hadoop (Hortonworks/Cloudera Stack)
- Experience with cloud big data technologies and architectures on Azure, Google Cloud or AWS
- Experience in securing cloud environments
- Experience with the development and deployment of infrastructure as code using Azure DevOps (formerly VSTS)
- Experience operating tools such as Databricks and Terraform
- Experience operating Linux, Java, MS/Azure SQL, SQL Data Warehouse and SAP HANA
- Team player with strong interpersonal, written and verbal communication skills
- Fluent communication skills in English (spoken and written)
We offer
A challenging area of responsibility with a high degree of personal ownership. You will get the opportunity to work with cutting-edge technologies on exciting digitalisation projects in the area of big data. You will be trained on the job in a dedicated, competent team.