Big Data Solution Architect Job Description
Big Data Solution Architect Duties & Responsibilities
To write an effective big data solution architect job description, begin by listing detailed duties, responsibilities and expectations. We have included big data solution architect job description templates that you can modify and use.
Sample responsibilities for this position are illustrated in the example job descriptions below.
Big Data Solution Architect Qualifications
Qualifications for a job description may include education, certification, and experience.
Licensing or Certifications for Big Data Solution Architect
List any licenses or certifications required by the position: AWS, TOGAF, BI
Education for Big Data Solution Architect
Typically a job would require a certain level of education.
Employers hiring for the big data solution architect job most commonly prefer their future employee to have a relevant degree, such as a Bachelor's or Master's Degree in Computer Science, Engineering, Information Systems, Mathematics, Software Engineering, Education, Business, or Electrical Engineering, or in another technical field
Skills for Big Data Solution Architect
Desired skills and experience for a big data solution architect are detailed in the example job descriptions below.
Big Data Solution Architect Examples
Big Data Solution Architect Job Description
- Support technical pre-sales efforts and attend customer meetings as needed, acting as a trusted advisor to clients throughout the sales process, including exploration discussions, deal formulation, team selection, and project launch
- Understand and continually monitor the vast big data ecosystem and the motivations driving it
- Provide technical thought leadership on the Big Data & DevOps software ecosystem, including the existing solution landscape
- Identify customer needs and requirements at a detailed level and match these back to proposed services or solutions
- Act as a Big Data & DevOps "evangelist" through attendance and speaking engagements at industry conferences on key topics in the DevOps industry
- Understand the business challenge: clearly understand business challenges and their impact, and assess the business value of solution alternatives
- Work closely with our global partners
- Travel varies by geographic territory, but can be up to 25%
- Enable you to work by removing mundane processes from your daily routine
- Challenge you with some of the most complex problems in today’s market
- Intimate working knowledge of the full Systems Engineering life cycle (Requirements, Analysis, Design, Implementation, and Testing) and of project management methodologies
- The position is ideally located in Washington, D.C., but we are open to Telework/Remote location for the ideal candidate
- Strong verbal and written communications skills are a must, as is the ability to work effectively across internal and external organisations and virtual teams
- Technical – Web services development/deployment experience, IT systems and network engineering experience, security and compliance experience
- Economic and business – RFP/Acquisition support
- Ability to think strategically about business, product, and technical challenges in an enterprise environment
Big Data Solution Architect Job Description
- Support your personal ambitions and help you learn new technologies
- Presentations/proposals will be made to high-level stakeholders (C-level, VP, government officials); candidates must have experience presenting to high-value stakeholders
- Consulting engagements include short on-site projects proving out the use of AWS services to support new solutions
- Help define and architect Hitachi solutions to be made available in the centralized EDC (European Distribution Center) facility, where they can showcase functionality, solution fit, solution benefits, and more, with a primary focus on Big Data, Analytics, and IoT solutions
- Collaborate with big data experts, data scientists, and consultants across Pentaho and Hitachi to deliver advanced custom demonstrations, workshops, and POCs
- Act as a key stakeholder in collaboration with product management, engineering, and professional services across Pentaho and Hitachi
- Maintain a clean and operable working lab space in the EDC to serve the need of showcasing Hitachi solutions to customers
- Work with Account teams and customers to ensure that the POC/Demo request and documentation process is followed consistently
- Oversee the “change management process” when client or account team needs or requirements change during the planning, deployment or execution of a POC/Demo
- Attend ongoing training to keep technical skills up to date with respect to Hitachi products, applications, and big data and analytics solutions
- Highly technical and analytical, possessing 10 or more years of analytics platform implementation or operations experience
- Implementation and tuning experience of Data Warehousing platforms, including knowledge of Data Warehouse Schema Design, Query Tuning and Optimisation, and Data Migration and Integration
- Must have a minimum of 8 years' software development experience
- Must have 3+ years Hadoop-related architecture experience
- Must have 2+ years architecting Big Data hardware/software deployments
- Experience designing solutions with a good understanding of cluster and parallel architectures
Big Data Solution Architect Job Description
- Build an ongoing log of “lessons learned” for each technology area to help drive improvements in future field and/or customer engagements
- Deconstruct all environments once completed and ensure proper restocking and accounting for all the components used
- Drive conceptual and logical architecture design for new initiatives
- Provide technical recommendations and trade-offs to address business needs and timelines, and drive issues to resolution
- Participate in cross-functional, cross-discipline architecture teams to enhance / set the architectural direction for key business initiatives
- Serve as a fully seasoned, technically proficient resource
- Influence, negotiate, and lead evaluations and implementations of technology alternatives across the Technology and Line of Business organizations
- Guide the solution architecture of small and mid-size applications in functional areas
- Contribute to enhancements and maintenance of corporate standards and of architecture and development guidelines, addressing new reporting and analytical use cases, technologies, and approaches
- Define the evolutionary steps of the Analytics Platform technical roadmap and drive the creation of small and mid-size foundational features
- Data warehouse, BI and ETL tools
- Some knowledge of database types and models such as OLAP, graph, key-value, object, XML, JSON, SQL, NoSQL, tuple store, columnar, and in-memory
- Must have exposure to developing solutions using continuous integration
- Active participation in open source communities
- Graduate degree in Computer Science, Statistics or an equivalent Engineering degree required
- 8-10 years of working experience in general solution architecture and big data architecture
Big Data Solution Architect Job Description
- Provide BI advisory services by contributing to cross-functional, cross-application integration for reporting demands/projects from different regions and functional units
- Partner with business and technology stakeholders to drive future state architecture for our enterprise Data, Reporting and Analytics platform and solutions
- Lead or develop proof of concepts and innovate in solution development with new technology frameworks
- Engage closely with the tech and data teams and evangelize the adoption of standard architectural practices in everyday development work
- Actively participate in the HPE professions program and Practice Improvement activities
- Minimum of 1 year of experience with major GCP data platform components (BigQuery, BigTable, DataFlow, DataProc, DataPrep, Pub/Sub, Machine Learning)
- Detailed understanding of cloud infrastructure components, and of GCP components in particular
- Minimum of 3 years of development and implementation of enterprise-level data solutions utilizing large data sets
- The engineer/architect will design, perform POCs where needed, and develop the enterprise's Apache Kafka distributed messaging and integration ecosystem (a minimal POC sketch follows this list)
- Analysis and design of IT concepts
- Good experience with streaming and data processing concepts and tools such as Storm, Impala, Oozie, Mahout, Flume, and ZooKeeper
- Good knowledge of data governance and of compliance regulations such as GDPR
- Deep experience with a wide variety of common BI tools and emerging tools such as Platfora, Datameer, and Alteryx
- Experience with Enterprise Data Warehouse Platforms such as Teradata, Netezza, Exadata, MSSQL, DB2
- Strong understanding of network configuration, devices, protocols, storage technology, data center processes, Linux, LDAP/Kerberos
- 5+ years of pre-sales client facing experience architecting and/or implementing large, distributed data warehouses, big data, or real-time data intensive applications
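The Kafka responsibility above is the most implementation-oriented item in this example. As an illustration of the kind of proof-of-concept such an architect might assemble, here is a minimal sketch; it assumes the kafka-python client, a broker reachable at localhost:9092, and a hypothetical topic name poc.events, none of which come from the job description itself.

```python
# Minimal Kafka POC sketch: publish and consume JSON events.
# Assumptions: a broker on localhost:9092, the `kafka-python` package,
# and a hypothetical topic name `poc.events`.
import json

from kafka import KafkaConsumer, KafkaProducer

TOPIC = "poc.events"  # hypothetical topic for the POC

# Producer: serialize dicts to JSON bytes before sending.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"event": "demo", "value": 42})
producer.flush()  # block until the message is actually delivered

# Consumer: read the topic from the beginning and decode JSON.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating when no new messages arrive
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.offset, message.value)
```

In a real engagement the interesting decisions sit outside this sketch: topic partitioning, replication, schema management, and consumer-group design, which is why the role asks for an architect rather than only a developer.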
Big Data Solution Architect Job Description
- Participate in the creation and validation of new solutions and their coordination with IT Architects
- Define and document architecture artefacts such as reference architectures, frameworks, principles, and integration patterns to be applied to the Core Banking Application
- Design a Core Banking application architecture that meets both functional and non-functional requirements
- Actively participate in all phases of the SDLC to ensure that the architecture is implemented in accordance with the approved design, framework, and software stack
- Perform proof-of-concept (POC) on new technologies and approaches to ensure the proposed architecture is sound
- Ability to work with various focus teams, such as development, infrastructure, and business, to understand requirements and the current technology landscape and to develop a sound architecture
- Ability to use different tools
- Responsible for the end-to-end solution of mission-critical applications involving the Big Data stack and modern ML packages
- Responsible for benchmarking existing solutions, identifying gaps, and proposing solutions to eliminate them
- Responsible for clearly articulating the technical solution to senior stakeholders
- Strong aptitude for learning new technologies, and evaluating fit for a client environment
- Experience with Docker/microservices and related patterns of deployment is preferred
- Understanding of industry standard data integration architectures
- 8+ years of Enterprise Data Warehouse and Big Data experience (functional and/or technical)
- Understanding of enterprise integration tools and technical solution components
- Familiarity with Architecture Frameworks and Methodologies (Proact/Zachman/TOGAF or equivalent)