Data Engineer

Compensation: $84,860.00 - $126,040.00 /year *

Employment Type: Full-Time

Industry: Information Technology



Skills

• Hadoop, HDFS, Spark
• Hive, HBase
• Python, Java
• ETL/data modeling

Risk is a consolidated platform for Risk Management and Risk Reporting globally. It is supported by Enterprise Technology Solutions (ETS) within the Global Functions Technology Services organization. Risk provides the strategic infrastructure for the Institutional Clients Group, Commercial Bank, Consumer Bank and Investment Products, Treasury, and Enterprise Risk Management. Risk is a cross-product, cross-domain, top-of-the-house subject. This presents an opportunity to build a big data platform around this large data set, as well as next-generation analytics on top of it, such as cross-domain risk, secondary effects of risk events, and the behaviour of a portfolio under various scenarios, across all stripes of risk: capital optimization, balancing risk and P&L, and so on.

Job Purpose

Design, develop, and enhance enterprise applications in the Risk Technology area using Big Data technologies, primarily Spark.

Key Responsibilities

• Interact with business analysts to understand the requirements behind BRDs/FRDs/SRs
• Develop a complete understanding of the application code through code compilation, code walkthroughs, execution flow, and overall design
• Local compilation, deployment, and behaviour/unit testing
• Identify the areas where code needs to change to meet the required functionality, and maintain traceability
• Participate in design review and code review meetings (local/global)
• Unit testing, integration testing, and UAT/SIT support
• Code check-in, check-out, merge, and build management as needed
• Report to the program manager on project/task progress as needed; identify risks and issues
• Participate in all project planning and progress meetings with the team and global managers

Qualifications

Knowledge/Experience

• At least 5 to 7 years of application development experience through the full lifecycle
• Experience with Red Hat Linux and Bash shell scripting
• Knowledge of Core Java and OOP is required
• Thorough knowledge of, and hands-on experience with, the following technologies: Hadoop, MapReduce framework, Hive, Sqoop, Pig, Hue, Unix, Java, Impala
• Cloudera certification (CCDH) is an added advantage
• Strong experience with any ETL and BI tools

Skills

• Conceptual understanding of data structures
• Passion for technology; self-starter
• Orientation towards disciplined development processes
• Strong software development lifecycle management experience

Education

• B.Tech from a top engineering college or university, preferably in computer science; other preferred branches are EE and ECE
• Candidates from other disciplines with a passion for coding and systems development may also apply
• Work experience in a product company is an added advantage
* The salary listed in the header is an estimate based on salary data for similar jobs in the same area. Salary or compensation data found in the job description itself is accurate.
