Data Architect, Big Data Resume Samples
The Guide To Resume Tailoring
Guide the recruiter to the conclusion that you are the best candidate for the data architect, big data job. It’s actually very simple. Tailor your resume by picking relevant responsibilities from the examples below and then add your accomplishments. This way, you can position yourself in the best way to get hired.
Craft your perfect resume by picking job responsibilities written by professional recruiters
Pick from the thousands of curated job responsibilities used by the leading companies
Tailor your resume & cover letter with wording that best fits each job you apply for
Resume Builder
Create a Resume in Minutes with Professional Resume Templates
CHOOSE THE BEST TEMPLATE
- Choose from 15 Leading Templates. No need to think about design details.
USE PRE-WRITTEN BULLET POINTS
- Select from thousands of pre-written bullet points.
SAVE YOUR DOCUMENTS IN PDF FILES
- Instantly download in PDF format or share a custom link.
Heber Parisian
3391 Little Hills, Chicago, IL
Phone: +1 (555) 407 3129
Experience
Data Architect, Big Data
Nicolas-Prohaska – Houston, TX
- Review and manage work assignments for team in an Agile development environment
- Effective listener who gives timely, constructive feedback; states opinions assertively and deals constructively with conflict
- A problem solver who is also willing to compromise to reach a common, effective and efficient solution
- Works as part of a team; cross-trains and communicates effectively and efficiently in a virtual, global team environment
- Develop and maintain relationships with key business and technology partners
- Ensure all code changes are managed according to the company best practices
- Identify opportunities for improvement as well as innovate
Solutions Architect Big Data
Doyle-Conroy – Phoenix, AZ
- Conduct performance reviews, leadership development and evaluations of direct reports
- Work with Business Stakeholders and Project Managers to understand inefficiencies in clients’ existing business processes and applications and recommend solutions
- Engage in technology sourcing and selection activities, working with third-party vendors to identify solution options
- Work collaboratively with clients, intermediaries and internal CACI technical experts to build high impact solutions
- Lead deliveries of business projects and sub-programmes; working with a diverse team of architects, technologists and third parties
- Adheres to configuration management and documentation practices
- Expertise in architecting digital and multi-channel technology solutions, covering all layers of the technology stack, from presentation through to physical infrastructure
Think Big Principal Data Architect
McCullough Inc – Houston, TX (present)
- Development of data flows and templates, often working with the customer’s developers to help them learn how to use Kylo
- Analyze complex distributed production deployments, and develop a plan to optimize performance
- Recommend and design integration with third-party systems and network management tools
- Development of Java extensions to the Kylo framework
- Lead Kylo developer training in a classroom setting
- Present the Kylo roadmap, vision, proposed data lake architectures and business outcomes to all levels – developers, architects, CTO, CIO
- Installation and testing of Kylo framework
Education
Bachelor’s Degree in Computer Science
Washington State University
Skills
- Basic knowledge of machine learning, statistics, optimization or related field is a plus
- Knowledge on NoSQL platforms
- Demonstrated experience and success managing projects
- Experience with the major big data solutions like Hadoop, MapReduce, Hive, Pig and other tools in the Hadoop ecosystem
- Understanding of major programming/scripting languages like Python, Scala
- Experience modeling data in SAP BW and SAP HANA
- Experience working with large data sets and distributed computing tools
- Planning and administration of project tasks and dependencies
- Facilitating team status meetings
- Developing project related documentation
15 Data Architect, Big Data resume templates
Read our complete resume writing guides
1
Big Data Software Architect Resume Examples & Samples
- Work with global architecture team to drive the technical strategy and architecture
- Mentor engineers, complete hands-on technical work and provide leadership on complex technical issues
- Lead the big platform research, design and implementation
- Foster and encourage growth of technical team members by providing technical input, advice, coaching, guidance, deliverable review, etc
- Act as a technical expert and make significant contributions to all stages of product development, including but not limited to helping engineering team to get deep understanding of business requirements, technical requirements and technical operation requirements
- 8+ years' experience of requirement analysis, design, development and implementation of large-scale and high performance distributed platform and applications
- At least 3 years leadership experience in architectural design, technical decision-making and solution estimation
- Expert in big data technologies including Hadoop/Spark/Storm
- Hands-on skill in coding/debugging/troubleshooting
- Broad knowledge of data warehouse data modeling, design patterns, refactoring and unit testing. Familiar with agile, continuous integration and continuous delivery, etc
- Excellent written and good spoken communication skills in English
- Experience with large data sets, especially real-time streaming data, is a big plus
- Knowledge of the database implementations is a plus
- Network Infrastructure (Routing and Switch) experience is a big plus
- Participation in well-known open source distributed systems is a big plus
2
Big Data Technologist / Architect Resume Examples & Samples
- Architecting, building and implementing a data platform on Big Data technologies
- Leading innovation by exploring, investigating, recommending, benchmarking and implementing data-centric technologies for the platform
- Being the technical architect and point person for the data platform
- Collaborating with application engineering teams and providing platform support for applications in a very agile environment
- Passion and interest for all things distributed - file systems, databases and computational frameworks
- Hands on programming and development experience; excellent problem solving skills; proven technical leadership and communication skills
- Having a solid track record building large-scale, fault-tolerant systems over the whole lifecycle of the project - you have spent significant time and effort observing large-scale systems in production and learning from them
- Strong understanding of how the various technical pieces fit together: you can explain why you made architectural/design decisions and chose certain tools/products
- Having made active contributions to open source projects like Apache Hadoop or Cassandra
- Experience in large scale Stream processing frameworks such as Storm, Streambase and S4 is a strong plus
- Experience in analytic frameworks such as Impala, Phoenix (for HBase), Berkeley Analytics stack (Spark/Shark) and Pandas/NumPy is a strong plus
- Prior experience with large scale distributed RDBMS (Teradata, Netezza, Greenplum, Aster Data, Vertica) is a plus
- Knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2/S3/EMR, RackSpace Cloud, OpenStack) is a plus
- 5+ years of programming experience (Java and Python on Linux)
- 3+ years of hands-on experience with key-value store technologies such as HBase and/or Cassandra (a minimal connection sketch follows this list)
- 2+ years of experience with the Hadoop stack - MapReduce, Pig, Oozie, Scribe/Flume etc
- 2+ years of hands-on experience doing production troubleshooting and performance tuning of the Hadoop stack
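To make the key-value store bullet above a little more concrete, here is a minimal Python sketch of the kind of Cassandra work these samples describe, using the DataStax cassandra-driver. The host names, keyspace and table are hypothetical placeholders, not part of the original sample.

```python
from uuid import uuid4
from datetime import datetime, timezone

from cassandra.cluster import Cluster

# Hypothetical contact points and keyspace for a platform event store
cluster = Cluster(["cassandra-1.example.com", "cassandra-2.example.com"])
session = cluster.connect("platform_ks")  # keyspace assumed to exist

# Idempotent table definition for a simple event log
session.execute("""
    CREATE TABLE IF NOT EXISTS events (
        event_id uuid PRIMARY KEY,
        payload text,
        ingested_at timestamp
    )
""")

# Insert one event; the driver binds %s-style parameters from the tuple
session.execute(
    "INSERT INTO events (event_id, payload, ingested_at) VALUES (%s, %s, %s)",
    (uuid4(), '{"source": "clickstream"}', datetime.now(timezone.utc)),
)

cluster.shutdown()
```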
3
Solutions Architect Big Data Resume Examples & Samples
- Lead deliveries of business projects and sub-programmes; working with a diverse team of architects, technologists and third parties
- Preparation of technology blueprints, supporting technical documentation and technology roadmap
- Engage in technology sourcing and selection activities, working with third-party vendors to identify solution options
- Be part of a wider technology design authority, managing architectural changes, and maintaining adherence to agreed digital roadmaps
- Play a leading role in supporting business development activities, as a technology subject matter expert
- Partner with key technology, business stakeholders and analysts to ensure well-formed, practical strategies
- An experienced solution architect, able to identify gaps and shape a comprehensive, end-to-end design and technology strategy
- Recognised as a trusted adviser by senior executives, with a track record of creating successful technology solutions that support a strategic vision and goals
- Able to bring to bear extensive knowledge of architectural patterns, technology components, vendor solutions and emerging technology trends on digital solution development in one or more industries, retail preferred
- A confident written and verbal communicator able to present at all levels
- Experience in engaging and leading the sourcing and selection process for third-party hardware, software and other technology components and services, as required to enable delivery
- Able to bring technology leadership to support successful business development of digital opportunities
4
Software Architect, Big Data Resume Examples & Samples
- Directly responsible for the system architecture, system software, and creation of the logical deployment architecture for the eCommerce division
- Identify and implement strategies for service enabled technologies. Aligns the architecture with business objectives and company technology direction
- Experience in strategies to architect and design re-usable components across website and mobile
- Manage multiple high level priorities
- Advanced written and verbal communication skills. Ability to effectively communicate technical issues and solutions to all levels of business
5
Big Data / Data Layer Architect Resume Examples & Samples
- Order prototype activities to evaluate proposed changes to the Architecture domain you are responsible for
- Ensure the re-use of common components
- Ensure the quality attributes of the system are maintained within desired parameters
- Conduct architecture proofs including prototyping/benchmarking activities
6
Big Data Infrastructure Platform Build Architect Resume Examples & Samples
- Infrastructure and big data platform design, review, and implementation plans in support of the application requirements
- Develop implementation plans. Work closely with the platform delivery team, solutions architects and application architects to ensure the environments are built accurately, per the standards and on time
- Drive optimum utilization of system resources
- Provide early consultation to Application Development teams as new development efforts are launched - to help ensure that infrastructure and platform best practices are followed and result in better integration of application and infrastructure
- Work with Application Architecture and Development teams to identify requirements for distributed computing services
- Work with Global Technology Infrastructure (GTI) and Application Architecture teams to create fully integrated application computing architectures
- Develop and build center of service excellence through process optimization in requirement gathering, interaction with development team, systems provisioning, app deployment and commissioning
- Monitoring framework: ensure the infrastructure is monitored adequately. Identify gaps if any, and update the framework appropriately
- Performance tuning and capacity planning - lead and develop a framework to ensure production performance SLAs are met and the defined capacity headroom is always maintained
- Cyber security - ensure application infrastructure is compliant with the firm's policies and security requirements
- Process automation and continuous integration
- Proactively identify new technologies promising efficiency (cost/productivity) in operations and carry out proof of concepts
- Work closely with enterprise architects to collaborate on the development and maintenance of infrastructure platform Technology Product and Services Roadmap
- Work with engineering team to define best practices and processes as appropriate to support the entire infrastructure lifecycle – Plan, Build, Deploy, and Operate – and automate lifecycle activities such as self-service, orchestration and provisioning, configuration management, etc
7
Big Data Technical Architect / Sales Engineer Resume Examples & Samples
- Candidate needs to have hands-on experience with Hadoop applications (e.g. administration, configuration management, monitoring, debugging, and performance tuning)
- Experience with one of the large cloud-computing infrastructure solutions Amazon Web Services or Microsoft Azure
- Must be able to benchmark systems, analyze system bottlenecks and propose solutions to eliminate them
- Must be able to work creatively and analytically in a problem-solving environment
- Must be able to work in a fast-paced agile development environment
8
Big Data Technical Architect Resume Examples & Samples
- Experience with Hadoop, MapReduce, Hive, HBase, MongoDB, Cassandra
- Experience in Impala, Oozie, Mahout, Flume, ZooKeeper and/or Sqoop
- Major programming/scripting languages like Java, Linux, PHP, Ruby, Python and/or R
- Experience in working with ETL tools such as Informatica, Talend and/or Pentaho
- Experience in designing solutions for multiple large data warehouses with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS and/or knowledge on NoSQL platforms
- Skilled architect with cross-industry, cross-functional and cross-domain know-how
- Experience with data security and privacy
- Must have excellent written and verbal communication skills
- Must be able to perform detailed analysis of business problems and technical environments and use this in designing the solution
- Must be able to work in teams, as a big data environment is developed in a team of employees with different disciplines
- Deep understanding of SOA principles and Web Services technologies: REST & SOAP
9
Big Data Modeler / Architect Resume Examples & Samples
- Provide leadership and guidance to project teams and other architects on all aspects of BI architecture: RDBMS (SQL, Netezza, etc.), Hadoop architecture (if applicable), SAP modeling (HANA or BW, if applicable), normalized and dimensional data modeling, ETL, reporting, etc
- Provide guidance and consultation on specific delivery methodologies for BI projects, such as technical requirements and model design review
- Work with the Enterprise Data Modeler (Information Management team) to set strategy and oversee design for significant data modeling work, such as Enterprise Logical Models, Conformed Dimensions, Enterprise Hierarchy etc
- Lead efforts to define/refine execution standards for all data warehouse layers (ETL, data modeling, MOLAP/ROLAP/OLAP, reporting, platform etc.)
- Participates in meetings to review the design of BI projects. This will include high-level design of the overall solution and detailed design of components as needed (Data Warehousing, ETL, user interface, analysis/reporting, etc.)
- Regularly interact with BI leadership on project work status, priority setting and resource allocations. Provide assistance to project teams as they go before change control boards to implement their projects into production
- Research new tools and/or new architecture and review with project teams as applicable
- Work with support team to define methods for implementing solutions for performance measurement and monitoring
- Assist infrastructure leads and BI Delivery teams as needed with background and information on all technologies in use for projects such as new version upgrades, migration of hardware, production issues, etc
- Provide leadership and guidance on setting up environments used by the BI team so that they are optimized for a leveraged, multi-tenant operation
- Designs and implements data ingestion techniques for real-time and batch processes for structured and unstructured data sources into Hadoop ecosystems and HDFS clusters (see the batch ingestion sketch after this list)
- Designs strategies and programs to collect, store, analyze and model data from internal/external sources. Awareness and understanding of public data sets and the ability to ingest and integrate them
- Development and implementation of data design methods, data structures, and modeling standards. Implement industry standard development policies, procedures and standards
- Leverages industry networking and contacts to benchmark with peer customers and share knowledge and best practices
- 8+ years overall experience in IT applications development or consulting related to IT applications development
- 5+ years in technical development and leadership roles on BI and Data Warehouse projects with significant experience in the majority of these activities
- Designing star schema data models
- Designing normalized data models for historical data warehouses
- Designing and developing with ETL tools
- Designing and developing with MOLAP/ROLAP/OLAP tools
- Designing and developing with relational reporting tools
- Managing or coordinating the time and activities of other people and other groups
- ERWin for Data Modeling or similar
- Relational DBMS – experience with "appliance-like" platforms, such as Netezza (preferred), Teradata, or others
- Experience with leading ETL tools, such as SAP Data Services (preferred), MS SSIS, Informatica, etc
- Experience with leading OLAP/ROLAP tools (Business Objects, Microsoft, MicroStrategy, Hyperion Essbase or Cognos)
- Experience with relational reporting tools (Microsoft, Business Objects etc.)
- Experience with the major big data solutions like Hadoop, MapReduce, Hive, Pig and other tools in the Hadoop ecosystem
- Understanding of major programming/scripting languages like Python, Scala
- Experience working with large data sets and distributed computing tools
- Knowledge on NoSQL platforms
- Basic knowledge of machine learning, statistics, optimization or related field is a plus
- Experience modeling data in SAP BW and SAP HANA
- Planning and administration of project tasks and dependencies
- Facilitating cross-functional requirement gathering meetings
- Developing relationships with internal Customers and Service Providers
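As a companion to the ingestion bullet above, the following is a minimal PySpark sketch of a batch load from a landing zone into HDFS as Parquet. The paths and the order_id / order_date columns are assumptions for illustration, not part of the original sample.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch_ingest_orders").getOrCreate()

# Hypothetical landing-zone path; schema is inferred from a CSV drop
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("hdfs:///landing/orders/2023-01-01/"))

# Basic conformance: de-duplicate on the assumed business key, stamp ingestion time
cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("ingested_at", F.current_timestamp()))

# Write to an assumed warehouse location, partitioned for downstream Hive/Impala use
(cleaned.write
 .mode("append")
 .partitionBy("order_date")
 .parquet("hdfs:///warehouse/orders/"))

spark.stop()
```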
10
Big Data Administrator / Architect Resume Examples & Samples
- Architect and evolve Rogers Enterprise Big Data Platform to support Enterprise data management, operational, reporting and analytical systems and applications
- Design, install, configure and administer Rogers Enterprise Big Data Hadoop platform: Dev, QA, Production clusters, applications and services in both physical and virtualized environments
- Implement best practices to design, install and administer services to secure Big Data environments, applications and users including Kerberos, Knox and Ranger
- Implement best practices to configure and tune Big Data environments, application and services, including capacity scheduling
- Install and configure high performance distributed analytical applications utilizing Enterprise Big Data platform, including commercial (SAS) and open source Machine Learning frameworks
- Work closely with hardware & software vendors, design & implement optimal solutions
- Conduct day-to-day administration and maintenance work on the Big Data environment
- Look after both incident & change management
- Monitor and meet service level targets
- Manage capacity utilization to ensure high availability and multi-tenancy of Big Data systems (a minimal monitoring sketch follows this list)
- Perform capacity planning based on Enterprise project pipeline and Enterprise Big Data roadmap
- Provide technical inputs during project solution design, development, deployment and maintenance phases
- Help with purchase decisions based on business requirements
- Assist with preparing and reviewing vendor SOWs
- Assist and advise network architecture and datacenter teams during hardware installations, configuration and troubleshooting
- Actively participate in architecture and design of the next generation Rogers Big Data platforms
- A degree in Computer Science, Engineering, Systems Administration, Technology or a related field
- Hadoop 2.0 Administration certification
- 2+ years of Production Big Data Administration experience
- Hortonworks HDP 2.0 / YARN production Administration experience
- Minimum 1 year of production experience with Hadoop installation, performance tuning, configuration, optimization, job processing
- In-depth understanding of best practices and production experience with Hadoop cluster security frameworks (authentication and authorization): Kerberos, Knox, Ranger
- Production experience with Hadoop components and services: Hive, Pig, Hbase, Sqoop, Falcon, Oozie, Ambari
- Experience with Flume/Kafka/Storm is a strong asset
- Experience administering both virtualized and physical environments
- Strong virtualization skills and production experience with VMware vSphere 5.1 / 5.5 is an asset
- Expert Linux / Unix administration skills and experience
- Expert Linux / Unix scripting skills
- Strong Systems Networking administration skills and experience
- Experience within the Telecommunication industry is an asset
- Highly motivated and very proactive individual, dedicated to follow-up/follow through without reliance on management for direction
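To illustrate the capacity-utilization bullet above, here is a small Python sketch that polls the YARN ResourceManager REST metrics endpoint, the sort of check an administrator might script. The ResourceManager host name and the 85% alert threshold are assumptions.

```python
import requests

# Hypothetical ResourceManager host; 8088 is the default YARN RM web/REST port
RM_URL = "http://rm-host.example.com:8088/ws/v1/cluster/metrics"

metrics = requests.get(RM_URL, timeout=10).json()["clusterMetrics"]

used_pct = 100.0 * metrics["allocatedMB"] / metrics["totalMB"]
print(f"Running apps: {metrics['appsRunning']}, pending: {metrics['appsPending']}")
print(f"Memory in use: {used_pct:.1f}% ({metrics['availableMB']} MB free)")

# Simple headroom check; the threshold is an assumption, tune per cluster SLA
if used_pct > 85:
    print("WARNING: memory headroom below 15% - review capacity scheduler queues")
```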
11
Big Data Presales Architect Resume Examples & Samples
- Ability to gather and understand information discovery, business intelligence reporting, query, analytic, and real-time data processing requirements (including the underlying business requirements that drive selection of the tools) and effectively recommend solutions
- Ability to gather data management system requirements, position how and why Hadoop clusters, NoSQL databases, and data warehouses are deployed, and effectively recommend solutions
- Ability to gather data integration and data ingestion requirements from a variety of structured, semi-structured, and streaming data sources and effectively recommend solutions
- Ability to discuss tradeoffs in on-premise and cloud-based approaches, understand data transmission requirements, and suggest viable approaches
- Ability to apply information / enterprise architecture methodologies and best practices (such as TOGAF and Oracle’s OADP) during discovery and recommendations and prepare appropriate deliverables
- Ability to communicate with lines of business and technical audiences
- Ability to apply knowledge of certain industries and uncover the drivers of potential projects
- Ability to apply expertise in Oracle products and strategies (and / or those of competitors) that provide needed solutions
- Ability to work with other team members and drive initiatives and initial project planning including architecture, value determination, targeted demonstrations, and proof of value testing with well defined success criteria
- Any knowledge of some or all of the following products would be considered beneficial: Oracle Advanced Analytics, Big Data Appliance, Oracle Big Data Connectors, Oracle Big Data SQL, Cloudera Hadoop, Oracle NoSQL Database, Oracle Big Data Discovery, Oracle Data Integrator, Oracle Event Processing and Big Data cloud services solutions
- Financial Services Industry experience
12
Big Data Forward Architect, Paris / Suresnes Resume Examples & Samples
- Ability to both understand and anticipate requirements to make the global DIL offer evolve
- Identify key technical bricks to evaluate and implement in the platforms
- Participates in onboarding new entities
- Assists entity architects in designing integration architectures to work with data & digital assets
- Articulate pros and cons of various technologies and platforms
- Document use cases, solutions and recommendations
- Help program and project managers in the design, planning and governance of implementing projects of any kind
- Perform detailed analysis of business problems and technical environments and use this in designing the solution
- Experience with a wide range of big data architectures, Hadoop and non-Hadoop, including HDFS, Redis, Pig, Hive, Impala, Mahout, Spark, Shark, R, Tableau and other big data frameworks
- Ability to work and get results in international teams with no hierarchical structures
- Ability to work in a multi-cultural environment
- To be creative and innovation minded
- To work with minimal direct guidance, self-motivated and proactive
- To work in a collaborative model, side by side with the business
- Manage and participate in internal and external communities of experts
- To deal with competing priorities and pressure
- Fast adaptation to changing requirements
- Results and value oriented
- Leadership, influence and conflict resolution
13
Data Architect for Big Data Systems Resume Examples & Samples
- Take the lead on designing data architecture solutions in compliance with firm wide data management principles and security controls for a big data environment consisting of Spark, HBase, Kafka, Impala etc
- Determine the best mix of big data software solutions to streamline data flow for different business use cases and data discovery across the firm while maintaining strict access controls
- Collaborating with technologists and data scientists to set up test plans and use cases to evaluate new Hadoop-native technologies for ETL, visualization, data science, discovery, etc
- Design the connection between the big data environments and metadata management/security platform to ensure efficient capture and use of metadata to govern data flow patterns and access controls across the firm
- Maintain the end-to-end vision of the data flow diagram and develop logical data models into one or more physical data repositories
- Document logical data integration (ETL) strategies for data flows between disparate source/target systems for structured and unstructured data into common data reservoir and the enterprise information repositories
- Define processes for the effective, integrated introduction of new data and new technology
- Ability to think through multiple alternatives and select the best possible solutions to solve tactical and strategic business needs
- BS Degree in Computer Science or Engineering
- 10+ years of hands on experience in Data Management/Data Architecture/Application architecture/Data Development
- 3+ years of experience working with modern big data systems for data management, data architecture, security and access controls
- Experience managing data transformations using Spark and/or NiFi and working with data scientists leveraging the Spark machine learning libraries (see the Spark ML sketch after this list)
- Proficiency working within the Hadoop platform including Kafka, Spark, HBase, Impala, Hive, and HDFS in multi-tenant environments and integrating with other 3rd party and custom software solutions
- Solid expertise in data technologies; i.e., data warehousing, ETL, MDM, DQ, BI and analytical tools. Extensive experience in metadata management and data quality processes
- Hands-on experience with dimensional modeling techniques and creation of logical and physical data models (entity relationship modeling, Erwin diagrams, etc.)
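As a sketch of the Spark transformation and machine learning bullet above: a minimal PySpark pipeline that reads a curated table, assembles features and fits a logistic regression model. The table name, feature columns and label column are hypothetical, chosen only to keep the example self-contained.

```python
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("churn_model_sketch").getOrCreate()

# Hypothetical curated table in the data reservoir; nulls filled for modeling
df = spark.table("reservoir.customer_features").na.fill(0)

assembler = VectorAssembler(
    inputCols=["tenure_days", "monthly_spend", "support_tickets"],  # assumed columns
    outputCol="features",
)
lr = LogisticRegression(labelCol="churned", featuresCol="features")  # assumed label

train, test = df.randomSplit([0.8, 0.2], seed=42)
model = Pipeline(stages=[assembler, lr]).fit(train)

# Training-set AUC as a quick sanity check; proper evaluation would use `test`
print(model.stages[-1].summary.areaUnderROC)
```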
14
Software Architect, Big Data Resume Examples & Samples
- You are experienced with object-oriented design, coding and testing patterns
- You have a high bar for operational excellence; ability to quickly lead team through issues resolution, root cause analysis and successfully prioritize corrective action
- You're a great teammate with a dedication to growing others around you as much as yourself
- You enjoy mentoring junior engineers and promote a culture of continuous learning; take an active role improving the overall potential of the group
- You have a real passion for designing and leading the implementation of resilient, distributed software platforms and large-scale data infrastructures
- You have a desire to learn about, evaluate and appropriately drive team adoption of new technologies and methodologies
- You have a successful track record of engineering Big Data solutions with technologies like Hadoop, Hive & Spark
- 5+ years Java development experience
- Delivery of solutions built on AWS or other large-scale cloud platforms a plus
- Experience with streaming data ingestion, machine-learning, Apache Spark and Cassandra a plus
15
Solutions Architect, Big Data Specialist Resume Examples & Samples
- 7+ years’ experience of IT platform implementation in a highly technical and analytical role
- At least 5 years’ experience of Big Data service or solution implementation is required
- Deep understanding of database and analytical technologies in the industry including Databases, noSQL, Data Warehouse design, BI reporting and Dashboard development
- Demonstrated ability to think strategically about business, product, and technical challenges in an enterprise environment
- Current hands-on implementation experience required
16
Technical Architect Big Data Solutions Resume Examples & Samples
- Above-average technical or scientific degree
- 2 years of hands-on technical experience estimating and conceptualizing large-scale data solutions, preferably using one or more of the following technologies: Hadoop, Spark, NoSQL
- Profound experience with data warehousing (minimum 4 years)
- 3 years track record in building and deploying solutions to Big Data problems, building and implementing architecture roadmaps for next generation enterprise data and analytics solutions as well as working with Cloudera, Hortonworks, DataStax or MapR
- Excellent communication, analytical and conceptual skills
- A strong command of both written and spoken English and German
- Team spirit and willingness to travel
17
Big Data Technical Architect Resume Examples & Samples
- Responsibility for the entire Development Architecture within the Data Lake
- Ensuring that all projects follow known design patterns and solutions
- Projects are delivered on time and on budget
- Understanding Business drivers, functional and non-functional
- Proponent of innovation, best practices and sound design with data & information optimisation in mind
- Develop data strategy and data architecture
- Extensive experience with the Hadoop eco-system and associated technologies such as Cassandra, MongoDB, NoSQL, Elasticsearch, etc
- Experience with any relational database
- Should be strong in conceptualising and problem solving
- Highly analytical, with structured thinking and decision making
- Team leading skills with ability to present comfortably to senior management
- Design scalable and reusable solutions on the Big Data Platform
- Lead solution governance activities and code implementation into production, and follow through on bug fixes with the relevant individuals responsible for the code
- Create and review technical design documentation to ensure the accurate development of Big Data solutions
- Provides solution and technology advice and guidance to the wider Big Data team
- Guide and mentor Big Data and consumer Development team members (e.g. BI) through technical solution deliverables
- Create, maintain and update development standards, process and product methodology
18
Software Architect Digital Pathology With a Passion for big Data Resume Examples & Samples
- You are responsible for the design of software on component or module level
- You have a deep understanding of the consequences of your design on the architecture
- You are responsible for communicating effectively the consequences of your design on the architecture
- You will design software, on the basis of design specifications in accordance with the functional specifications
- You will finalize the design specifications, code and test the designed modules or components, so that the software in question will be reliable, efficient, easy to maintain, and user-friendly
- Perform work in line with the product development or software engineering processes that have been agreed in the department
- You will #codeforcare
19
Think Big Principal Data Architect Resume Examples & Samples
- Primary and Lead Data and Solution Architect on a project. End to end data pipeline knowledge including metadata, security, data quality, data modeling, building custom views for applications and BI Tool presentation when needed
- Present the Kylo roadmap, vision, proposed data lake architectures and business outcomes to all levels – developers, architects, CTO, CIO
- Work directly with customer’s technical resources to devise and recommend solutions based on the understood requirements
- Document and present complex architectures for the customer’s technical teams
- Design and recommend approaches for big data ingestion strategies from any data source or type, including use of leading third party tools and their integration with overall metadata management
- Development of data flows and templates, often working with the customer’s developers to help them learn how to use Kylo
- Development of Java extensions to the Kylo framework
- Custom integration of the Kylo framework to customer systems
- Write and produce technical documentation
- Lead Kylo developer training in a classroom setting
- Work closely with Think Big teams at all levels to help ensure the success of project consulting engagements with the customer
- Keep current with the Hadoop Big Data ecosystem technologies
- Travel up to 75%
20
Big Data, Technical Architect Resume Examples & Samples
- Provide broad insight and design recommendations for applications, including big data applications
- Own direct responsibility in terms of key technology choices and implementation
- Maintain standard compliance
- Design, code and debugging of applications
- Production server setup, performance tuning, improvement and automation
- Provide guidance, assistance, and technical direction to team members
- Work with various disciplines to provide technical insight on new initiatives
- Interact with key stakeholders to exchange information
- B.E/ B.TECH/ M.TECH/ MS or equivalent with 8+ years hands-on experience in Big data / Java / J2EE application design and development
- Completion of minimum of two to three full life cycle development projects
- Ability to effectively communicate and interact with user representatives and peers in design and feedback sessions
- Working knowledge of RDBMS, Java, Spring, Hibernate, HDFS and/or MongoDB, Spark and other Apache technologies
- Knowledge of additional open source tools/platforms will be an added advantage
- Understand the business need for big data – what kind of data, how much data, types of algorithms to be run, load on the system, budget, etc. – and recommend optimal solutions
- Build and implement the solution. This will need you to be hands on to build in quick prototypes/proof of concepts
- Work with the operations team to build systems, process and team required to run and maintain the systems securely, reliably and in a scalable manner
- Good understanding of infrastructure including server sizing
- Experience in database design/implementation; Version Control systems such as GIT, CVS or Subversion (SVN)
- Strong debugging, troubleshooting, and diagnostic skills
- Passionate about solving problems, quality and learning new technologies
21
Enterprise Architect, Big Data Resume Examples & Samples
- Participate in an ongoing partnership with the business to apply in-depth knowledge of the business operations, strategies, priorities and information requirements to establish the technical application direction
- Define enterprise application strategies
- Define the system, technical, and application architectures, and in some instances the business systems/process architecture for major areas of development, with a focus on application architecture
- Ensure appropriate technical application standards and procedures are defined
- Ensure best practices are adhered to in the adoption of new application technologies
- Research, evaluate and select application technologies (existing or emerging) that best fit business and IT strategic needs
- Ensure the delivery process and technology strategies are coherent and optimized
- Participate in developing and architecting application solutions in a multi-project, collaborative environment
- Design and assist with the delivery of proofs-of-concept for new or improved enterprise-wide technologies that are used across multiple areas of the business
- Architect application solutions across multiple hardware/software computing environments and system components; and
- Plan and implement process re-engineering or process improvement
- Must have 7+ years of application design, development, and architecture experience using Microsoft .NET platform or Java/J2EE technologies
- 3+ years of hands-on experience with the technologies in the Hadoop ecosystem like Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop, Cloudera Impala, Zookeeper, Oozie, Hue, Kafka
- Extensive knowledge of application architecture, design, and development
- Extensive knowledge and experience with Big Data Architecture, Distributed Architecture, MicroServices and EAI/EI
- Extensive knowledge of various technology architectures and hardware platforms, both existing and emerging
- Working knowledge of EA methodologies and tools, including TOGAF or Zachman architecture frameworks
- Working knowledge of relational and dimensional database concepts
- Must be a methodical and pragmatic problem-solver
- Must have a strong sense of teamwork, active listening skills and negotiation and influencing skills
22
Big Data Application Architect Resume Examples & Samples
- Leading all technical aspects of the project including interfaces
- Creating transition plans
- Providing ad hoc technical support
- At least 3 years experience designing/developing/delivering architecture/infrastructure/data integration/data management support environments including virtualized systems environment, Big Data databases, security and portals
- At least 3 years experience providing technical leadership on projects
- At least 3 years experience creating transition plans
- At least 3 years experience implementing SBMC2
- At least 8 years experience designing/developing/delivering architecture/infrastructure/data integration/data management support environments including virtualized systems environment, Big Data databases, security and portals
- At least 8 years experience providing technical leadership on projects
- At least 8 years experience creating transition plans
23
Solutions Architect Big Data Resume Examples & Samples
- Identifies discrepancies between the enterprise technical architecture and systems designs proposed by project teams, and assist project teams in resolving the discrepancies
- Designs real-time software applications on selected platforms
- Acts as an advisor to SPAWAR and DHA system engineers and proposes changes to the enterprise technical architecture based on analysis of requirements and new technology
- Expert in ETL design and implementation, able to direct teams
- Conduct performance reviews, leadership development and evaluations of direct reports
- 4-5 years of Agile Software Development
- Analytical and technical skills with the ability to analyze issues, assess technical risks, and deliver sound solutions in a timely manner
- Expertise in ITIL processes for incident management, problem management, change management and release management
- 8+ years of experience supporting big data platforms with exposure to the latest technology in Big Data, Master Data Management (MDM) and Data Quality Services
- Experience with MHS and/or VA data and analytical uses
- Experience working with dashboards, visualization and reporting front end
- Experience with various BI tools such as Tableau, SSRS, SSAS, SAS, etc
- Expert in relational database concepts and SQL
- Strong leadership skills, decision-making and problem-solving abilities
- Experience with CMMI Level 3 delivery is a plus
24
Architect, Big Data Resume Examples & Samples
- Design and build world-class, high-volume, real-time data ingestion and processing frameworks and advanced analytics on big data platforms (a streaming ingestion sketch follows this list)
- Data Management strategy defining direction of data organization; metadata management within Data Lakes
- Research, develop, optimize, and innovate frameworks and patterns for enterprise scale data analysis and computations as part of our big data and Internet of Things initiatives
- Lead the implementation of Hadoop Data model strategy by creating architecture blueprints, validating designs and providing recommendations on the enterprise platform strategic roadmap
- 3+ years of hands-on implementation experience working with a combination of the following technologies: Hadoop distributions, Storm and Spark Streaming, Kafka, Spark advanced analytics, NoSQL data warehouses such as HBase and Cassandra, data processing frameworks like Apache NiFi, Talend, Spring XD
- 1+ years’ experience in designing and implementing big data solutions. This includes creating the requirements analysis, design of the technical architecture, design of the application design and development, testing, and deployment of the proposed solution
- Experience in Data lakes and Data Vaults concepts
- Demonstrated leadership and interpersonal skills, including decision making, planning, organizing, influencing, mentoring, facilitating, collaborating and negotiating
- Deep understanding of database and analytical technologies in the industry including MPP databases, noSQL storage
- Big Data security, including Hadoop security and emerging technologies such as in-memory and NoSQL
- Programming language -- Java (must), Python, Scala, Ruby
- Batch processing -- Hadoop MapReduce, Cascading/Scalding, Apache Spark
- Stream processing -- Apache Storm, Akka, Samza, Spark Streaming
- NoSQL -- HBase, MongoDB, Cassandra, Riak
- ETL tools -- DataStage, Informatica
- Code/Build/Deployment -- git, hg, svn, maven, sbt, jenkins, bamboo
- Familiarity with Agile and DevOps practices (continuous integration, continuous delivery, automation)
- TOGAF certification is an asset
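To give the real-time ingestion responsibility above a concrete shape, here is a minimal PySpark Structured Streaming sketch that lands Kafka events into HDFS as Parquet. The broker addresses, topic name and paths are assumptions, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clickstream_stream_sketch").getOrCreate()

# Hypothetical Kafka brokers and topic
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
          .option("subscribe", "clickstream")
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers key/value as binary; cast to strings for downstream parsing
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

# Land micro-batches to an assumed raw zone with checkpointing for exactly-once sinks
query = (parsed.writeStream
         .format("parquet")
         .option("path", "hdfs:///datalake/raw/clickstream/")
         .option("checkpointLocation", "hdfs:///checkpoints/clickstream/")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```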
25
Data Architect, Big Data Resume Examples & Samples
- Validate the reference architecture and provide details for the architecture of platforms, leading and working hands-on towards implementation and delivery to production for our MapR platform
- Provide architecture/design for solving various business problems. The solutions will need to consider the full enterprise data hub platform including, but not limited to MapR, Data Virtualization and Drools/BRMS
- Help lead the charge on development strategy, ensuring rapid delivery while taking responsibility for applying standards, principles, theories and concepts
- Ensure proper/complete documentation is to standards; architecture, design, metadata, etc
- Deliver innovative solutions that enable the business to gain rapid access to data, with a long-term vision in mind
- Deliver high-quality, comprehensive solutions
- Diligently teaming with the infrastructure, network, database, application and business intelligence teams to ensure architecture and designs exceed expectations
- Experience with the data/development architecture in the field of big data specific to Hadoop Zoo; MapR experience preferred
- Data definitions - catalogs, canonical structure, etc
- Indicate components of the zoo which are best suited to solve business problems
- Data lifecycle management
- Instance/container approach to utilize
- Data Lake and Data Hub definition
- Prior experience in migrating big data platforms from earlier to latest versions of the platforms by defining application code and/or other impacts
- Knowledge of SCM concepts using tools like Git, SVN etc
- Excellent written and verbal communication skills with ability to communicate technical issues to nontechnical and technical audiences
- Software Development Experience - 3+ Years (Unix/Linux scripting, python, SQL/TSQL)
- Hadoop/MapR; Spark/SQL, Sqoop, Hive, Drill, Impala, Flume, Jenkins, Kie, JBoss EAP, etc. - 2+ years
- BRMS/Drools - knowledge and/or experience is preferred
- TOGAF/DMBOK - knowledge and/or experience with standards is preferred
- SQL Server 2008 or newer - 3+ years
- Data Analysis - 3+ years (technical data analysis)
- SAP PowerDesigner, SAP Information Steward preferred
- Experience using tools such as RedGate, Atlassian (Jira), Wiki, Visual Studio, TFS, PragmaticWorks, Power Designer, Tortoise/SVN, Git, SQL Server
26
Big Data / AWS Architect Resume Examples & Samples
- AWS BIG DATA Platform
- Programming Experience in
- EMR
- Kafka
- Spark SQL
- Redshift
- AWS Orchestration
- Lambda architecture, including Machine Learning knowledge
- AWS Codebase Migrations for large scale BI/DWH projects
- Exposure to Project Management
27
EAA Big Data Practice Architect Resume Examples & Samples
- Master’s degree is a plus but not required
- 10+ years experience in information technology and/or IT professional services
- 6+ years in client facing roles with data architecture and providing project management and oversight within professional services
- 3+ years of hands-on big data technology experience
- Certification with one of the Hadoop distributions - Cloudera, MapR, Hortonworks
- Prior experience with Big Data ETL tools like Informatica BDM, Talend
- Prior experience with Spark ingestion/compute
- Hands-on experience with a few of the tools like Pig, Flume, Sqoop, Oozie, Kafka, NiFi, MiNiFi, Impala, Scala, etc
- Prior experience with R, Python is a plus
- Hadoop administration skills are a plus
- Experience with NoSQL databases like Cassandra, MongoDB, NuoDB, Couchbase, HBase, Redis is a plus
- Experience with various technology platforms, application architecture, design, and delivery including experience architecting large big data enterprise data lake projects
- Strong writing and client facing communications with the ability to effectively develop and maintain client relationships
- Excellent analytical and problem solving abilities
- Able to think and act strategically
- Action oriented and able to prioritize while handling multiple tasks
28
Big Data Infrastructure Architect Resume Examples & Samples
- Hands-on experience with designing and implementing distributed architecture systems at terabyte/petabyte scale using open source software
- Experience in full life cycle Hadoop Solutions: requirement analysis, platform selection, design future state application and enterprise architecture, testing and deploying solution
- Expert knowledge in modern distributed architectures and compute / data analytics / storage technologies on AWS
- Knowledge of programming and scripting languages such as Java/Python/Perl/Ruby and Linux
- Understanding of architectural principles and design patterns using frameworks such as Hadoop / Spark and/or AWS EMR (a provisioning sketch follows this list)
- Knowledge of SQL (MS SQL, PostgreSQL, MySQL) and NoSQL databases (HBase, DynamoDB, Cassandra)
- Knowledge of technical solutions, design patterns, and code for applications in Hadoop
- Experience in architecting and building data warehouse systems and BI systems including ETL (Informatica, Talend)
- Software Development Lifecycle (SDLC) experience
- AWS Architecture / Azure Architecture experience ideally with the appropriate vendor certification
- Understanding of hybrid cloud solutions and experience of integrating public cloud into traditional hosting/delivery models
- Experience as principal technical lead on at least one major project
- AWS or Azure trained / certified architect – e.g. Amazon Certified Solutions Architect – Associate or Professional Level
- AWS Redshift experience
- Oozie, Flume, ZooKeeper, Sqoop and/or R
- Hands on experience designing, developing, and maintaining software solutions in Hadoop Production cluster
- Experience of implementing architectural governance and proactively managing issues and risks throughout the delivery lifecycle
- Good familiarity with the disciplines of enterprise software development such as configuration & release management, source code & version controls, and operational considerations such as monitoring and instrumentation
- Experience of consulting or service provider roles (internal, or external)
- Strong knowledge of software development methodologies like Agile/Scrum, Kanban, etc. Broad understanding of enterprise project lifecycle
- AWS certification in any of the following - Solutions Architect, Developer or Systems Ops
- A good degree-level education is highly desirable
- Team lead and program or project management experience
- Experience using database technologies like Oracle, MySQL is required and understanding of NoSQL, MongoDB is preferred
- ISO27001/2 certification
- Redshift / Data warehouse experience
- Extensive automation experience with either Chef or Puppet
- Experience in designing or implementing data warehouse solutions is highly preferred
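As a companion to the AWS EMR bullet above, here is a minimal boto3 sketch that provisions a small Spark/Hadoop EMR cluster. Region, instance types, subnet, log bucket and cluster name are assumptions; the IAM roles shown are the EMR defaults.

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")  # assumed region

# Hypothetical spec for a small proof-of-concept analytics cluster
response = emr.run_job_flow(
    Name="analytics-poc",
    ReleaseLabel="emr-6.9.0",
    Applications=[{"Name": "Hadoop"}, {"Name": "Spark"}, {"Name": "Hive"}],
    Instances={
        "InstanceGroups": [
            {"Name": "primary", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,
        "Ec2SubnetId": "subnet-0123456789abcdef0",  # assumed subnet
    },
    LogUri="s3://my-emr-logs/",  # assumed log bucket
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
    VisibleToAllUsers=True,
)

print("Started cluster:", response["JobFlowId"])
```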
29
Senior Architect / Big Data Software Engineer Resume Examples & Samples
- Rapidly architect, design, prototype, and implement architectures to tackle the Big Data and Data Science needs for a variety of Fortune 1000 corporations and other major organizations
- Work in cross-disciplinary teams with KPMG industry experts to understand client needs and ingest rich data sources such as social media, news, internal/external documents, emails, financial data, and operational data
- Research, experiment, and utilize leading Big Data methodologies, such as Hadoop, Spark, Redshift, Netezza, SAP HANA, and Microsoft Azure
- Architect, implement and test data processing pipelines, and data mining / data science algorithms on a variety of hosted settings, such as AWS, Azure, client technology stacks, and KPMG’s own clusters
- Translate advanced business analytics problems into technical approaches that yield actionable recommendations, across multiple, diverse domains; communicate results and educate others through design and build of insightful visualizations, reports, and presentations
- Develop skills in business requirement capture and translation, hypothesis-driven consulting, work stream and project management, and client relationship development
- Bachelor’s degree from an accredited college/university in Computer Science, Computer Engineering, or a related field and minimum seven years of big data experience with multiple programming languages and technologies; or Master’s degree and a minimum of five years of experience; or PhD in Computer Science, Computer Engineering, or a related field with minimum three years of big data experience
- Fluency in several programming languages such as Python, Scala, or Java, with the ability to pick up new languages and technologies quickly; understanding of cloud and distributed systems principles, including load balancing, networks, scaling, in-memory vs. disk, etc.; and experience with large-scale, big data methods, such as MapReduce, Hadoop, Spark, Hive, Impala, or Storm
- Ability to work efficiently in a Unix/Linux environment or .NET, with experience with source code management systems like GIT and SVN
- Ability to work with team members and clients to assess needs, provide assistance, and resolve problems, using excellent problem-solving skills, verbal/written communication, and the ability to explain technical concepts to business people
- Ability to travel up to 80%
30
Senior Big Data AWS Architect Resume Examples & Samples
- Hands-on experience with Big Data technologies including administration, configuration management and production troubleshooting and tuning
- Hands-on experience working with AWS Cloud technology including S3, EMR, etc
- Knowledge and understanding of Java, Python, Linux
- Experience in benchmarking systems, analyzing system bottlenecks and proposing solutions to eliminate them
- Be able to clearly articulate pros and cons of various technologies and platforms
- Hands-on Experience in technologies like: Spark, Hive, Pig, Kafka, R, Storm
- Knowledge of traditional data analytics warehouses like Teradata
- Be able to document use cases, solutions and recommendations
- Liaise with Project Managers and other solution architects in planning and governance activities related to the project
- Bachelor’s in Computer Science, Data Science, Business Intelligence or a related technical field. 5+ years of experience in the Data Analytics field