DUTIES: Develop high-performance, distributed computing programs using Big Data technologies such as Hadoop, NoSQL, text mining, and other distributed-environment technologies. Use Big Data programming languages and technology to write code, complete programming and documentation, and perform testing and debugging of applications utilizing Java, J2EE, Spring, Oracle SQL, JavaScript, and jQuery. Work with MapReduce, Hadoop, Pig, Hive, and NoSQL. Analyze, design, program, debug, and modify software enhancements and new products used in distributed, large-scale analytics and visualization solutions. Interact with data scientists and industry experts to understand how data needs to be converted, loaded, and presented. Analyze and interpret research to evaluate and recommend solutions. Generate visualizations of data using graphic tools.
REQUIREMENTS: Requires a Bachelor's or foreign equivalent degree in Computer Science, Computer Engineering, or Mechanical Engineering and three (3) years of experience in the job offered or three (3) years of experience using Big Data programming languages and technology to write code, complete programming and documentation, and perform testing and debugging of applications utilizing Java, J2EE, Spring, Oracle SQL, JavaScript, and jQuery. Experience therein to include six (6) months working with MapReduce, Hadoop, Pig, Hive, and NoSQL.
AT&T is an Affirmative Action/Equal Opportunity Employer, and we are committed to hiring a diverse and talented workforce. EOE/AA/M/F/D/V
At AT&T, we’re bringing it all together. We deliver advanced mobile services, next-generation TV, high-speed Internet and smart solutions for people and businesses. That’s why we stand alone as a fully integrated solution provider.