Technical Specialist - Hadoop
1000 Universal Studios Plaza, Orlando, FL 32819
Are you familiar with one of the world’s leading media and entertainment companies in the development, production, and marketing of entertainment, news, and information? Are you also a Hadoop Specialist looking for an innovative and exciting opportunity in the entertainment industry?
Ektello is looking for a Hadoop Specialist for one of our top multimedia clients in Orlando, FL. Our client boasts an impressive portfolio of news and entertainment television networks, world-renowned theme parks, a prominent motion picture company, and much more.
RESPONSIBILITIES
- Monitor and maintain all data platform and analytical processes, including the Hadoop Data Lake ecosystem.
- This is a support administration role with the potential for some development responsibilities on minor enhancements and break/fix work.
- Ensure all system issues are resolved in a timely manner.
- Leverage standard Issue, Problem and Change Management ITIL toolset in tracking and providing status on support work activities.
- Coordinate the necessary operations documentation and standards required for compliance and identified best practices.
- Perform all software upgrades and patching partnering with the necessary Infrastructure teams.
- Provide work estimates as needed.
- Facilitate and lead meetings with end users.
- Keep management, team members, and business stakeholders informed of critical issues, status, changes, etc.
REQUIREMENTS
- 5+ years of experience in the development or operational support of enterprise BI analytics systems.
- Core understanding of EDW architecture.
- Strong SQL skills on RDBMS platforms (MS SQL Server, DB2, etc.).
- 2+ years of experience in operational support of the Linux server OS.
- Red Hat certified administrator (RHEL 6 or higher) preferred.
- Experience with JDBC and ODBC connections.
- Ability to read/interpret Java.
- Ability to produce high quality technical documentation.
- Strong knowledge of Agile project management methodologies/processes.
- Able to understand and interpret Entity-Relationship, logical and physical data models.
- Able to work with various platforms and databases (e.g., DB2, SQL Server); able to write complex SQL and leverage backend databases.
- Able to participate in an on-call rotation and to work off-hours and weekends.
- Strong interpersonal skills, with a demonstrated ability to make effective decisions while working through complex system issues.
- Must be able to utilize and effectively communicate functional and technical components of an initiative to applicable parties both verbally and through documentation.
- Attention to detail, strong analytical and problem-solving skills, and critical thinking.
- Self-starter with a proactive, agile, and strategic mindset.
- Bachelor’s degree or higher in a computer science field.
AN IDEAL CANDIDATE WILL ALSO POSSESS
- 7+ years of demonstrated experience working on large Information Technology teams and/or in consulting organizations, partnering with clients/business groups to support complex Big Data platform or BI analytics environments.
- 2+ years of experience in installation and administration of the Hadoop ecosystem.
- 2+ years of experience in data integration with the Hadoop ecosystem.
- Certifications in Hortonworks HDP and HDF administration or development preferred.
- Experience in operational support of HDFS and edge applications, including Hive (Tez, LLAP), Spark, Ranger, HBase, Knox, and Kerberos.
- Experience with NiFi as an ETL tool is preferred.
- Strong skills in troubleshooting and resolving HDFS cluster issues using Ambari.