Big Data Architect Job Description
Big Data Architect Duties & Responsibilities
To write an effective big data architect job description, begin by listing detailed duties, responsibilities and expectations. We have included big data architect job description templates that you can modify and use.
Sample responsibilities for this position include:
Big Data Architect Qualifications
Qualifications for a job description may include education, certification, and experience.
Licensing or Certifications for Big Data Architect
List any licenses or certifications required by the position: AWS, TOGAF, RHCSA, DOD, CWAPT, CEPT, CPT, CEH, CREA, CISSP
Education for Big Data Architect
Typically a job would require a certain level of education.
Employers hiring for the big data architect job most commonly would prefer their future employee to have a relevant degree, such as a Bachelor's or Master's Degree in Computer Science, Engineering, Technical, Business, Math, Mathematics, Software Engineering, Information Technology, Computer Engineering, or Education
Skills for Big Data Architect
Desired skills for a big data architect include:
Desired experience for a big data architect includes:
Big Data Architect Examples
Big Data Architect Job Description
- Evaluate and recommend emerging technologies and options for data integration and data management on a Hadoop Big Data platform and the Apache Spark framework
- Designing normalized data models for historical data warehouses
- Designing and developing with ETL tools
- Designing and developing with MOLAP/ROLAP/OLAP tools
- Designing and developing with relational reporting tools
- Designing and developing data architecture and data flow on Big Data Platforms
- Provide technical analysis
- Decompose components of the architecture for peer programming staff to develop
- Check program code/logic for accuracy and efficiency
- Prepare programs for production deployment and hand-off to operations and support staff
- BS in computer science or related field
- Should be able to articulate business/technology trends/POV in the Big Data space
- Knowledge of how to optimize the design and schemas for each of the components
- Experience with implementing and addressing operational issues, scripting, and monitoring in Big Data environments
- Experience in Reporting Architecture in a relational database environment
- Advanced Degree in a technical discipline and/or equivalent technical experience
Big Data Architect Job Description
- End-to-end cloud data solutioning and data stream design with Hadoop, Storm, Hive, Pig, Spark, AWS (EMR, Redshift, S3)/Azure (HDInsight, Data Lake design)
- Manage Big Data projects and workstreams with an emphasis on the AWS skillset
- Stay current with new developments in big data
- Manage web analytics project prioritization based on business needs (10%)
- Help perform source analysis and support departmental process improvement initiatives
- Assist with logical and physical model design, source-to-target mapping, use cases, and other development processes and artifacts
- Identify data subject areas, entities, attributes, relationships, information types, and domains
- You will work with Big Data software engineers and other Big Data architects to design and develop solutions for our clients, aligned with the business, development, deployment, and maintenance needs
- You will support architectural decisions and tasks within a product line or across multiple product lines (cross-portfolio)
- You will be challenged by creating designs and dictating technical standards
- Design robust, scalable systems
- A minimum of 10 years of software development experience, 7 years of which are specific to Big Data technologies
- Experience with processing large data stores and MapReduce programming in Apache Hadoop and Hadoop Distributed File System (HDFS)
- 15+ years of big data and relevant engineering experience while meeting expertise requirements
- Experience with design and implementation of enterprise data integration solutions and technologies such as Panzura, Talend, Google DataFlow, etc
- Minimum 7-10 years of hands-on design, architecture, and delivery experience implementing large and nimble analytical or data warehousing solutions, including a firm understanding of ETL and visualization tools
- The Big Data Architect will be responsible for guiding the full architectural lifecycle of a Big Data solution, including requirements analysis, governance, capacity requirements, technical architecture design (including hardware, O/S, and network topology), application design, testing and deployment
Big Data Architect Job Description
- Design and develop the architecture for all data warehousing components, including real-time data ingestion techniques, transformations, aggregations, and related data quality strategy
- Partner with BI teams, Data Integration developers, Data Scientists, Analysts, and DBAs to deliver a well-architected and scalable information management ecosystem
- Formalize standards and best practices for Big Data applications and real-time data integrations, and establish the architecture for a scalable analytics ecosystem
- Support Global Technology and Line of Business (LOB) with major incidents, changes and customer requests
- Replication setup / management
- System support, troubleshooting and resolution
- DR testing / recovery / backups
- Data porting
- Vendor management / escalation
- Participate in troubleshooting, support, and value-added initiatives
- You have a high bar for operational excellence
- You enjoy mentoring junior engineers and promote a culture of continuous learning
- Experience with NoSQL databases like MongoDB, Cassandra
- Experience in programming languages like Java, Scala
- Practical experience with Linux/Unix operating systems
- Ability to architect and design technology solutions encompassing multiple products and make decisions based on impact across the stack
Big Data Architect Job Description
- Drive information strategy and analysis engagements for clients’ business groups
- Delivery Lead and lead developer for multiple big data projects
- Knowledge of data science, machine learning, and statistical modeling techniques in the banking domain
- Work closely with the model development group to understand and meet business needs through the appropriate design and implementation of the model(s)
- Deep understanding of rich data visualizations to communicate complex ideas to business leaders
- Design and implement data ingestion techniques for real time and batch processes for a variety of sources into Hadoop ecosystems and HDFS clusters
- Visualize and report data findings creatively in a variety of visual formats that provide insights to the organization
- Define and document architecture roadmaps and standards
- Ensure scalability and high availability, fault tolerance, and elasticity within big data ecosystem
- Provide technical leadership and coaching to junior team members
- Should be able to translate requirements into reusable technical design patterns, taking into account industry best practices
- Ability to understand the big picture, understand risks and communicate effectively to team/relevant stakeholders
- Certification in a leading Hadoop distribution
- Knowledge of SparkR, Python, or similar languages
- Advanced knowledge of SQL and relational databases
- 4 years of experience with MongoDB and other NoSQL database concepts and design
Big Data Architect Job Description
- Designing and implementing data pipelines on Big Data and/or NoSQL platforms to enable rapid prototyping and accelerating the path to production
- Creating data management solutions covering data security, data privacy, metadata management, multi-tenancy and mixed workload management on Big Data and/or NoSQL platforms
- Makes decisions that have a major day-to-day impact on the area of responsibility
- Provide support for data warehousing initiatives and develop the data warehousing system to ensure reliability and accuracy of information loaded into the databases
- Independently design, code, and test major features, work jointly with other team members to deliver complex changes
- Provide technical leadership for our common data platform services
- Collaborate with product management and internal customers to turn functional use cases into technical specifications
- Design and implement the architecture for data patterns
- Architect and design Big Data Services in the cloud
- Develop engineering search and catalog capabilities for Big Data analytics
- 2+ years of experience in ETL design, data warehousing, and analytics
- Experience in RESTful API design and implementation
- Knowledge of Unix/Linux, shell scripting, and Python programming
- Experience with monitoring tools like Nagios, Munin, Zenoss
- Experience with data design tools such as IBM Data Architect, Erwin, and PowerDesigner
- Based platform