Join the Better Big Bank - one of Australia’s most trusted brands
Grow your skills at a top 100 ASX listed company
Permanent role (salary plus benefits – see our website for more info)

About us
With more than 160 years of history, we are proud of our position in the community, with more satisfied customers than any other Australian bank. Every day, we work hard to bring our company purpose to life, feeding into the success of our customers and communities, not off it. We're more than just a bank with banking products: we change the lives of customers and communities. Commercial actions with heart!
Our time is now. We are challenging the status quo, and we're excited about our future!

About the role
Now is an exciting time to join us as we continue to adopt the newest technologies that help the bank make data-enabled decisions. We’ve been through a successful transformation, and we are looking for a Data Engineer to help us develop, build and maintain our cloud-based data assets, including our Data Lake, Cloud Data Warehouse and Reporting. Your work will enhance our ability to onboard and connect with our customers, and enable us to derive value from our data.

Responsibilities include:
- Delivering functional and technical requirements for high-performing, end-to-end solutions across various data initiatives.
- Developing and maintaining data assets through extraction, transformation and loading of data from a variety of sources (structured or unstructured).
- Developing, testing and deploying code to a high standard in a variety of programming environments.
- Release planning, implementation and code reviews.
- Providing application-specific system analysis to support discovery, architecture and solution design activities.
- Managing delivery risks and meeting deadlines to proactively maintain the health of our cloud-based data assets.
About you
- Hands-on experience with cloud technologies such as AWS and/or GCP.
- Demonstrated experience with cloud-based data warehouse solutions using technologies such as Change Data Capture (CDC) with Kafka/Spark, Stitch, Databricks, Snowflake, etc.
- Experience deploying data workloads using CI/CD tooling (GitLab, Terraform or Ansible).
- Proven development experience in Python and ETL tools such as Talend and IBM InfoSphere DataStage, as well as data modelling skills.
- Experience with cloud EDWs, Data Lakes and Information Governance suites (Lineage, Glossary, Reference Data).
- Experience developing APIs for analytics/ML endpoints.
- Hands-on experience working with big data sets and large volumes, with a focus on optimal performance for loading and retrieving data.
- Previous experience working in an Agile environment using tools such as Jira and Confluence.

Why us?
There's so much more to a career with Bendigo and Adelaide Bank than just banking. Get real benefits, work-life balance and flexibility. You bring your brilliant mind, and we’ll help you take your learning to the next level with on-the-job training and external development opportunities - we want you to shine. After all, YOU are the difference that makes us the better big bank.
At Bendigo and Adelaide Bank we believe a diverse workforce supported by an inclusive culture is central to our success, and we actively encourage applications from those who bring diversity of thought to our business. We support candidate requests for adjustments to accommodate an illness, injury or disability so that everyone can equitably participate in the selection process.
Please apply before 9am, Friday 23rd April. We will be reviewing applications as they are received, so be quick to apply!