Pay as you go
- Maximum flexibility to learn at your own pace.
- Cancel anytime.
At 5-10 hours/week
Get access to the classroom immediately upon enrollment
Intermediate Python and SQL skills, and basic familiarity with ETL/Data Pipelines.
Learn the principles of data architecture. You will begin by learning the characteristics of good data architecture and how to apply them. Next, you will move on to data modeling: you will learn to design a data model, normalize data, and create a professional ERD. Finally, you will take everything you learned and create a physical database using PostgreSQL.
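As a taste of what normalization looks like in practice, here is a minimal sketch using Python's built-in `sqlite3` module (the table and column names are invented for this example; the program itself uses PostgreSQL):

```python
import sqlite3

# Instead of one wide table repeating customer details on every purchase
# row, a normalized design splits customers into their own table,
# referenced by a foreign key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    email       TEXT NOT NULL UNIQUE
);
CREATE TABLE purchase (
    purchase_id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    amount      REAL NOT NULL
);
""")

cur.execute("INSERT INTO customer VALUES (1, 'Ada', 'ada@example.com')")
cur.executemany("INSERT INTO purchase VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 24.5)])

# Customer details live in one place; purchases just reference the key.
cur.execute("""
SELECT c.name, COUNT(*), SUM(p.amount)
FROM purchase p JOIN customer c USING (customer_id)
GROUP BY c.name
""")
print(cur.fetchone())  # -> ('Ada', 2, 34.5)
```

Updating a customer's email now touches exactly one row, which is the kind of anomaly-avoidance that normalization buys you.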
Learn to design enterprise data architecture. You will build a cloud-based data warehouse with Snowflake. You will evaluate an organization's data assets and the characteristics of those data sources, design a staging area for ingesting the varieties of data coming from source systems, and design an Operational Data Store (ODS). Finally, you will learn to design OLAP dimensional data models, design ELT data processing capable of moving data from an ODS to a data warehouse, and write SQL queries to build reports.
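A dimensional data model can be sketched in a few lines: one fact table surrounded by dimension tables, queried with joins and aggregation. The example below uses in-memory SQLite for portability (table names and data are invented for illustration; the course itself works in Snowflake):

```python
import sqlite3

# A tiny star schema: fact_sales references two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    revenue     REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, 'Books'), (2, 'Games')])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20240101, 2024), (20250101, 2025)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240101, 100.0), (1, 20250101, 150.0),
                 (2, 20250101, 200.0)])

# A typical report query: roll revenue up by category and year.
cur.execute("""
SELECT p.category, d.year, SUM(f.revenue)
FROM fact_sales f
JOIN dim_product p USING (product_key)
JOIN dim_date d USING (date_key)
GROUP BY p.category, d.year
ORDER BY p.category, d.year
""")
for row in cur.fetchall():
    print(row)
```

The same GROUP BY-over-star-join pattern is what most warehouse reporting queries reduce to, whatever the engine.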
Learn how to help organizations with massive amounts of data, including identifying Big Data problems and designing Big Data solutions. You will learn about the internal architecture of Big Data tools such as HDFS, MapReduce, Hive, and Spark, and how these tools provide distributed storage, distributed processing, fault tolerance, and scalability. Next, you will learn how to evaluate NoSQL databases and their use cases, and dive deep into creating and updating a NoSQL database with Amazon DynamoDB. Finally, you will learn how to implement Data Lake design patterns and how to enable transactional capabilities in a Data Lake.
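The MapReduce processing model mentioned above can be sketched in a single process: map emits key-value pairs, a shuffle groups them by key, and reduce aggregates each group. This toy word count (function names are invented for the sketch) shows the phases that frameworks like Hadoop distribute across a cluster:

```python
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word seen.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework would do
    # between the map and reduce stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big tools", "data lake"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # counts 'big' and 'data' twice, 'tools' and 'lake' once
```

Because each reduce group is independent, the real systems can retry or relocate failed tasks, which is where the fault tolerance and scalability described above come from.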
Learn how to design a data governance solution that meets your company's needs. First, you will learn about the different types of metadata and how to build a Metadata Management System, Enterprise Data Model, and Enterprise Data Catalog. Next, you will learn how to perform data profiling using various techniques, including data quality dimensions; how to identify remediation options for data quality issues; and how to measure and monitor data quality using data quality scores, thresholds, dashboards, and exception and trend reports. Finally, you will learn the concepts of Master Data and the golden record, the different types of Master Data Management architectures, and the golden record creation and master data governance processes.
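A data quality score for a single dimension can be as simple as a ratio checked against a threshold. This sketch scores completeness (the field names, sample records, and threshold value are invented for the example):

```python
# Hypothetical customer records with some missing values.
records = [
    {"id": 1, "email": "a@example.com", "phone": "555-0100"},
    {"id": 2, "email": None,            "phone": "555-0101"},
    {"id": 3, "email": "c@example.com", "phone": None},
]

def completeness(records, field):
    """Share of records with a non-null value for `field` (0.0 to 1.0)."""
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

THRESHOLD = 0.9  # hypothetical target; misses would feed an exception report
for field in ("email", "phone"):
    score = completeness(records, field)
    status = "ok" if score >= THRESHOLD else "below threshold"
    print(f"{field}: {score:.2f} ({status})")
```

Tracking such scores over time is what turns a one-off profile into the dashboards and trend reports described above.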
Data Architect / Analytics Consultant
Benjamin has over 15 years of experience working as a data professional in fields including medicine, telecom, and finance, in roles ranging from data architect to data scientist and analytics consultant. He holds a Ph.D. in Decision Sciences, where his research focused on rare event detection.
CEO at OK2
Shankar Korrapolu is the co-founder and CEO of the startup OK2, a cross-platform mobile gaming engine that builds games faster and more cheaply without compromising quality. For over 30 years he has provided data processing services to organizations in the investment banking, pharma, government, and education sectors.
Senior Data Architect
Shrinath is an entrepreneur and Data Architect passionate about helping enterprise companies transform and engineer their big data analytics applications in the cloud. He has worked with the AWS, Google, and Microsoft cloud platforms, holds over 15 certifications, and has an MS in Computer Science from The University of Texas at Dallas.
Founder & Principal Data Architect
Vijaya is the Founder and Principal Data Architect for Great View Data Corp., which provides Data Architecture consulting and implementation services. Vijaya has extensive experience with creating architecture strategy and roadmaps, establishing frameworks and best practices, and Data Management.
Principal Data Architect
Rostislav is an Enterprise Data Architect and Data Management Leader whose expertise covers data governance, architecture, and integration practices across a diverse range of technologies. He has worked at companies of all sizes and in various industries. His musings can be found at learndataarchitecture.com.
Nick has built and managed teams of scientists and engineers for political campaigns, social media, and supply chain companies. With experiences ranging from startups to Amazon, he balances speed and scale. In his free time, Nick enjoys teaching graduate statistics courses at both Columbia and Yeshiva Universities.
The Data Architect Nanodegree program is designed for students with intermediate Python and SQL skills and basic familiarity with ETL/Data Pipelines, including: