Datacast Episode 130: Towards Accessible Data Analysis with Emanuel Zgraggen
The 130th episode of Datacast is my conversation with Emanuel Zgraggen, the CEO and Co-Founder of Einblick - a visual computing platform that enables data teams to answer tougher, more meaningful questions by making advanced analytics and model building more streamlined and accessible.
Our wide-ranging conversation touches on his upbringing and early career in Switzerland, his time as a graduate researcher at Brown University building interactive tools for visual data exploration, his stint at MIT as a post-doc, his current journey with Einblick building the visual computing platform for modern data analysis, lessons learned from hiring and fundraising, advice for commercializing an academic research project, and much more.
Please enjoy my conversation with Emanuel!
Listen to the show on (1) Spotify, (2) Google, (3) Deezer, (4) TuneIn, and (5) iHeartRadio
Key Takeaways
Here are the highlights from my conversation with Emanuel:
On His Apprenticeship Work in Switzerland
I was born and raised in Switzerland, in a small town just outside the city limits of Zurich. The education system in Switzerland is quite different from the one in the US. In secondary school, which is similar to high school in the US, there are two tracks to choose from.
The first track is a more standard academic path, which involves six years of high school followed by attending university. The second track is a vocational path, where students complete three years of high school and then pursue an apprenticeship in a field of their choice. For example, if someone wants to become an electrician, they would do an apprenticeship with a company in that field. The apprenticeship typically involves a combination of on-the-job training and attending vocational school one or two days a week.
When I finished primary school, I had to decide which track to choose. This decision depended on various factors such as grades and personal preferences. To pursue the academic path, I had to take an exam, which, unfortunately, I failed despite doing decently in school. As a result, I opted for the vocational track and didn't retake the exam the following year.
After three years of high school, when I was 15, I had to decide what type of apprenticeship to pursue. Initially, I wasn't sure, but then I came across business apprenticeships offered by some of the larger financial companies in Zurich. These apprenticeships combined business skills with IT and software development. Although I had always been interested in computers and enjoyed playing video games, I hadn't written any code before. However, the idea of learning both business and IT skills intrigued me.
I applied for an apprenticeship with Credit Suisse. During the apprenticeship, I was rotated through different departments, spending six months in the back office, six months as a bank clerk, six months in an IT group, and six months in a software engineering team. Additionally, I attended internal schools where I learned coding and software development.
This experience sparked my interest in computer science, and I thoroughly enjoyed the process of writing software and code.
On His Education at HSR (University of Applied Sciences Rapperswil)
Swiss higher education mirrors the dual system I described earlier: there are academic universities and more vocational colleges, or universities of applied sciences. The school I attended falls into the latter category. It is a small, engineering-focused school with around 700 students.
My degree focused specifically on software engineering. Unlike my later experience during my master's in the US, where I had around 12 hours of lectures per week, this school had approximately 30 hours of lectures per week, along with numerous labs.
I enjoyed it because it allowed me to delve deeper into programming and software engineering. I started to understand the background more and more. However, I didn't particularly like the strong vocational focus. Sometimes, instead of teaching the fundamentals and theory, the emphasis was on teaching specific technologies that were in demand in the Swiss industry at the time, such as Java Swing and Java Enterprise Edition skills, which quickly became outdated.
Interestingly, I actually enjoyed the math and calculus classes. We had a great professor who made math enjoyable and helped us appreciate it more, even though I wasn't particularly strong in that subject.
On the other hand, I didn't enjoy the physics classes as much since it wasn't my strong suit. However, I thoroughly enjoyed the programming and software engineering-oriented courses because writing code for different exercises was fun.
On Transitioning To Graduate School In The US
It was a lot of fun to apply what I learned at university in real life. I really enjoyed that time, maybe even more than I expected.
During that time, I made many good friends and colleagues working at different companies. I also took on some freelance projects, which added to the excitement. At the same time, after completing my degree, I realized that I wanted to delve deeper into computer science and explore the academic side of things.
Early on, I knew I would eventually want to pursue a master's degree or something similar. It was always in the back of my mind that I would go down this path. After a few years of working and applying what I had learned, I felt I was in a good place.
However, I was seeking a new challenge and started exploring the easiest way to pursue a master's degree at a more academic university or program. Due to the dual system of higher education in Switzerland, it was difficult for me to gain admission to a master's program at an academic university in Switzerland.
I would have had to retake several undergraduate classes from that particular university to be eligible. This led me to consider programs abroad. I discovered that pursuing a master's degree in Switzerland would take around three years, whereas doing it elsewhere would only take a year and a half. From a practical standpoint, it would be better to explore this opportunity abroad.
The idea of going abroad grew on me because it would give me the chance to improve my English. I was working at Credit Suisse, where English was the primary language of communication. Becoming more proficient in English seemed like a valuable skill to have. Additionally, experiencing a different country and immersing myself in a new culture sounded exciting.
Ultimately, I applied to several universities in the US, Canada, and even New Zealand. However, I also received a generous scholarship in Switzerland targeted explicitly towards individuals pursuing graduate degrees abroad. Everything fell into place, and I was accepted into Brown University. I decided to seize this opportunity and pursue my master's degree there.
On His Ph.D. Research at Brown
To answer the first question, my decision to go to Brown University was quite random. I initially planned to stay there for a year and a half to complete my master's degree and then return to Switzerland to find a job.
However, during my first week at Brown, I had the opportunity to meet several professors, including Andy van Dam. As part of my master's degree, I needed to work on a research project for my thesis. During casual conversations with different professors, I ended up joining Andy's lab and group. I had a great experience working with Andy, who eventually convinced me to stay and pursue a PhD.
Throughout my year and a half at Brown, Andy significantly influenced my decision to continue with a PhD. It was really a coincidence: I had never intended to pursue a PhD, but working with Andy changed my perspective and made it seem like a great idea.
Regarding your second question about a common thread in my research, I have always been interested in creating innovative and impressive demonstrations. I focused on developing visually appealing and interactive user interfaces that challenge traditional norms. I aimed to incorporate the intuitive interaction style of devices like iPhones and tablets into conventional user interfaces, emphasizing visual elements and fluid interactions.
This theme was prevalent throughout my research, including my work on handwriting recognition. I explored how to utilize the familiar act of writing with a pen to interact with computer systems.
By continuously creating impressive demos, I became increasingly passionate about making data analysis and data science more accessible. I aimed to address the challenges that arise during the process.
On His Proudest Research Paper
The paper I am most proud of is the first one I ever wrote, titled "PanoramicData." It ended up being an early predecessor of some ideas that went into the Einblick product. The system allowed users to visually create SQL queries, chart data, and perform various common data analysis tasks through a user-friendly interface that incorporated pen-and-touch interaction.
The reason I am particularly fond of this project is not because I consider it to be the best paper I have ever written or because it is the most exciting, but rather because it was my first research paper and the most challenging to write. Coming from a non-academic background, I had never written a research paper before in my career. Going through the process, including the frustrations of conference rejections, taught me a great deal about research and how to write papers that resonate well with other researchers. It took me a long time and was a difficult process, but that is why I take pride in it. It laid the foundation for all the things I learned while working on it.
Over time, I have learned that Andy, my advisor, tends to be hands-off. He allows you to work on whatever interests you and pursue your own ideas, which is great. However, developing the habit of writing papers can be challenging without someone guiding you through the process, providing feedback, and teaching you how to structure a research paper.
Fortunately, Andy was excellent at connecting me with other mentors who later became instrumental in my research journey. One of them was Steven Drucker, a researcher at Microsoft Research, who filled the role of guiding me through the paper-writing process. As an early PhD researcher, having someone who can provide practical guidance on writing a successful paper is extremely valuable.
On Working At The Intersection of HCI, InfoVis, and Data Science
I have always been interested in human-computer interaction and interfaces in general, which is why I started working with Andy. At that time, Andy had received grants from Sharp and Microsoft. Both companies had launched big interactive whiteboards, ranging from 50 to 80 inches in size, with touch and pen support. They gifted us a couple of these devices and provided funding for us to develop exciting research ideas on utilizing these tools and interfaces.
While brainstorming ideas for what would work well on such a large screen, we realized that data visualization could be a great way to leverage the screen real estate of these devices. This led us to delve deeper into the topic of using big interactive whiteboards and pen-and-touch interaction to work with and visualize data. Later on, we also explored machine learning and data science. That's how it all started. It was a product of circumstances I found myself in.
During my time at Credit Suisse, one of the last things I worked on was similar to what is now called data engineering. My role involved extracting data from Bloomberg, formatting it correctly, and storing it in the appropriate database, which involved a lot of ETL (Extract, Transform, Load) work. Although this is now considered part of data science, at the time I had more experience working with large datasets, databases, and SQL than with visualizations or machine learning. At Brown, I learned more about visualizations and machine learning. I worked closely with Tim Kraska, a Brown professor who later moved to MIT and became a co-founder of Einblick. Tim was more focused on data and machine learning, so I learned a lot from him in that respect.
On Interning at Microsoft Research
Overall, I had a great time at Microsoft Research. I want to mention Steven Drucker, who was my mentor there; he really helped me in my career and was an incredible mentor. It was a place where I could openly share ideas, brainstorm with others, and listen to their feedback and opinions, which influenced the direction of my research. If one thing stands out, it's the idea that good research happens when people collaborate.
Looking back, I also realized the advantage big companies have over academic institutions regarding research. At Microsoft, we conducted a user study with product managers who used product telemetry data to analyze their products. Finding participants within Microsoft was easy. We sent an email to a mailing list, and within a day, we had 15 people signed up for our user study and access to their data. It was much easier to access these resources than doing something similar at a university, where access to data and people working with that type of data is more challenging. This also stands out from my experience interning at Microsoft.
On Being A Postdoc at MIT CSAIL
For me, transitioning from Brown to MIT was actually relatively easy. This was mainly because I had already worked closely with Tim Kraska while he was at Brown. The Northstar project actually started at Brown before Tim moved to MIT. I ended up following him there, so it all made sense.
Getting accustomed to a new university and research team was smooth because I already knew the people there, or at least Tim and his colleagues. However, being a postdoc at MIT felt more like a job than being a student. It's a different role, and you don't feel as deeply connected to the university culture as you would as a student or professor.
Despite that, I had a great time at MIT. The database group was fantastic, and I learned a lot from discussions with other postdocs and PhD students. It's an excellent environment for research.
On The Founding Story of Einblick
It all started with the project we previously discussed, the Northstar project, which was sponsored by a DARPA grant for a long time. The objective of the grant was to build systems that would make data science more accessible.
As you can imagine, there is a vast amount of data in the government, but there aren't enough data scientists to address all the questions that may arise. The DARPA team was searching for ways or tools to empower their existing government workforce to be more productive with data. For example, how to enable regular data analysts to be effective with the latest advancements in machine learning. Throughout this grant, our research group grew and became tightly knit.
Everyone on the Northstar project later became a co-founder of Einblick. We strongly believed in the value of what we were doing with this project. Eventually, it became evident that people wanted to use some of the prototypes we built. However, we could not support this as part of an academic research team. Things like password resets for users were not our concern as researchers; our goal was to do enough to write a paper about it. We didn't have the necessary infrastructure to support people using the product.
At that time, we faced a decision. The DARPA project was ending, my postdoc was coming to an end, and others were finishing their PhDs. We were at a crossroads: should we let it die as an academic project or keep it going? To continue, we had to spin it out into a company.
Linking it back to the previous questions, I had never considered doing a startup. I was focused on staying in research. However, after discussing it with others and having early conversations with the folks at Amplify Partners, who eventually led our seed round, the idea grew on me and the rest of the team. We saw the potential.
What really solidified the idea for me personally was when we were invited to O'Reilly's Strata Data Conference, thanks to Ben Lorica. We were invited as an academic project to have a booth among startups and established companies. The conference had a strong focus on data science practitioners.
Over two or three days, we gave countless demos of our prototype to hundreds of people. The response was overwhelmingly positive. Everyone wanted to try it out and showed great interest. This convinced me that spinning this into a company could be a good idea.
To some extent, it was natural because each of Einblick's co-founders already had different areas of specialization in the research project.
Zeyuan Shang, for example, is a backend person particularly interested in scaling computation and automating machine learning. He took charge of that aspect of the company and built out that part of the system.
Philipp Eichmann, who also did his PhD with Andy at Brown, had a similar background and focused on UI/UX. He became the leader of the UI front-end team.
Benedetto Buratti, on the other hand, had a more theoretical data science research background. He became our AI evangelist and top data scientist, engaging with customers and providing consulting services to companies interested in using Einblick in the data science space.
On Building An Integrated Environment for Descriptive, Predictive, and Prescriptive Analytics
The way to think about the product we're building is like a combination of Miro, one of those whiteboarding tools, and a Jupyter notebook. It offers the same UI modality as a whiteboard, allowing you to zoom in, zoom out, pan around, and quickly lay out your analysis. However, it is powered by the full capabilities of Python that you would expect from a Jupyter notebook.
In terms of technical challenges, there are two main areas to consider. Firstly, there are interesting UI challenges in presenting all the functionality and options in a way that is not overwhelming and easy to learn. We want to make it simple for someone from a Python notebook background to jump into the platform without feeling overwhelmed. How can we achieve that?
Secondly, we need to figure out how to manage and execute these diverse workflows efficiently on the backend side. The platform should be able to handle large datasets and deliver results quickly to the user without them having to wait and stare at progress bars for extended periods. It's a complex systems challenge, as various aspects exist, including UI, backend execution model, and more.
Overall, we aim to strike a balance between providing a user-friendly interface and ensuring efficient execution of workflows, taking into account the diverse requirements of both frontend and backend aspects.
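To make the backend challenge described above more concrete, one technique from this line of research (the approach behind Vizdom, listed in the papers below) is progressive computation: process the data in chunks and refine an approximate answer as more data arrives, so the user sees a result immediately instead of a progress bar. The sketch below is a minimal illustration of that general idea in Python, not Einblick's actual execution engine.

```python
import random

def progressive_mean(values, chunk_size=10_000):
    """Yield successively refined estimates of the mean instead of blocking
    until the full dataset has been scanned."""
    total, count = 0.0, 0
    for start in range(0, len(values), chunk_size):
        chunk = values[start:start + chunk_size]
        total += sum(chunk)
        count += len(chunk)
        yield total / count          # running estimate after each chunk

# Hypothetical "large" dataset, purely for illustration.
data = [random.gauss(100, 15) for _ in range(1_000_000)]

for i, estimate in enumerate(progressive_mean(data), start=1):
    if i == 1 or i % 20 == 0:
        print(f"after {i * 10_000:>9,} rows: mean ~ {estimate:.2f}")
```

The user gets a usable estimate after the first chunk and watches it converge, which is the interaction model the paragraph above is describing.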
On Real-Time Remote Collaboration
Generally, I firmly believe that good data science can only happen when people with different backgrounds, experiences, and expertise levels work closely together.
The method of a stakeholder simply telling a data science team what solution they want and then the team working on it independently for a few weeks before presenting the results often leads to the realization that some initial assumptions were completely wrong. This kind of rigid process in data science is bound to fail. It is important to quickly respond to questions and to collaborate closely between technical experts and domain experts to ensure optimal results.
In such situations, a tool like Einblick can act as a shared language for people with different backgrounds to collaborate and work on data problems. It is easy to start a new canvas, similar to a new Miro board, in a meeting, pull data from different sources, visualize it, and iterate on data problems and prototype solutions. If you're satisfied with the results, you can then proceed to make them more robust. But for initial solutions, a tool like Einblick can greatly facilitate collaboration and ensure that everyone is on the same page when it comes to solving a particular data problem.
The video aspect of Einblick turned out not to be that complicated. What was more challenging for us was implementing the multiplayer mode, similar to what you see in Figma or Miro, where multiple people can view and edit the same content simultaneously. Ensuring that when someone makes a change, everyone sees the update in real time was a difficult task. However, overlaying videos on top of mouse cursors or other elements ended up being relatively simple. There are video services available that provide APIs that can be integrated into the tool. Overall, the multiplayer mode was a significant technical challenge to overcome, as it required accurately reproducing the state of a canvas on different clients without any corruption or issues.
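To make the state-synchronization problem concrete, here is a minimal Python sketch of one common pattern for multiplayer editing: the server keeps the canonical canvas state as an ordered operation log, and every client applies operations in server order so all replicas converge. This is a generic illustration, not Einblick's actual implementation, and the "upsert"/"delete" operation shapes are made up for the example.

```python
def apply_op(state, op):
    """Apply a single (hypothetical) canvas operation to a state dictionary."""
    if op["kind"] == "upsert":
        state[op["id"]] = op["value"]
    elif op["kind"] == "delete":
        state.pop(op["id"], None)


class CanvasServer:
    """Holds the canonical canvas state as an append-only, versioned operation log."""

    def __init__(self):
        self.log = []     # ordered list of (version, op) pairs
        self.state = {}   # canvas objects keyed by id

    def submit(self, op):
        """Accept an edit from any client, assign it the next version, apply it."""
        version = len(self.log) + 1
        self.log.append((version, op))
        apply_op(self.state, op)
        return version

    def ops_since(self, version):
        """Return the operations a given client has not seen yet."""
        return [entry for entry in self.log if entry[0] > version]


class CanvasClient:
    """A replica that stays in sync by applying operations in server order."""

    def __init__(self, server):
        self.server = server
        self.version = 0
        self.state = {}

    def edit(self, op):
        self.server.submit(op)   # a real UI would also apply the edit optimistically
        self.sync()

    def sync(self):
        for version, op in self.server.ops_since(self.version):
            apply_op(self.state, op)
            self.version = version


if __name__ == "__main__":
    server = CanvasServer()
    alice, bob = CanvasClient(server), CanvasClient(server)
    alice.edit({"kind": "upsert", "id": "chart-1", "value": {"x": 10, "y": 20}})
    bob.sync()
    assert alice.state == bob.state == server.state
```

Because every replica applies the same operations in the same order, no client's canvas can drift or become corrupted, which is the guarantee described above.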
On Building A Collaborative Visual Canvas
There are several common pain points with notebooks. As you mentioned, a paper from Microsoft documents some of these issues well.
One such pain point is that using a notebook for quick exploration and analysis can sometimes feel unnecessarily time-consuming. You often have to repeatedly install different packages for the same analysis.
A lot of boilerplate code also needs to be written, whether for creating visualizations or quickly trying out a random forest, and that repetition gets frustrating (a typical example is sketched below).
Another challenge is reproducing results and reusing code in notebooks. It can be difficult to reproduce a result in a notebook with hundreds of cells because the execution flow is not clearly defined.
Furthermore, collaboration in notebooks is lacking. While you can share notebooks through Git, there is no real-time collaboration feature. This makes it challenging to discuss notebooks with non-technical stakeholders.
Presenting a Jupyter notebook to less technical stakeholders can be problematic as well. It often means showing them lines of Python code they may find difficult to understand, which makes it hard for them to verify the assumptions made in the analysis.
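As an illustration of the boilerplate point above, here is roughly what a quick "try a random forest and chart the result" pass tends to require in a notebook, even before any real analysis happens. The file `data.csv` and its `label` column are hypothetical stand-ins, and the package installs would typically be yet another cell on top of this.

```python
# Illustrative only: the repetitive setup behind a quick model-and-chart pass.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("data.csv")                     # hypothetical dataset
X, y = df.drop(columns=["label"]), df["label"]   # split features and target
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

# ...and another round of boilerplate just to get a chart out of it.
plt.bar(X.columns, model.feature_importances_)
plt.xticks(rotation=45)
plt.tight_layout()
plt.show()
```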
Although notebooks have their strengths, they also bring these pain points. We have built a platform that can address some of them.
We have a built-in multiplayer mode and video/audio collaboration, making it easy to collaborate even when a lot of code is involved.
We also have the concept of building data flows, or execution flows, which makes results easy to reproduce (a toy illustration of the idea follows below).
Additionally, we have made importing existing work from Jupyter into Einblick easier, creating a smoother transition for data scientists.
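As a toy illustration of why explicit execution flows help reproducibility (this is the general idea, not Einblick's engine), the sketch below has each step declare its inputs up front, so the whole result can always be recomputed in a well-defined order, unlike notebook cells run out of sequence.

```python
# Hypothetical three-step flow: load -> clean -> summarize.
from graphlib import TopologicalSorter  # Python 3.9+

steps = {
    "load":      (lambda deps: list(range(10)),                         []),
    "clean":     (lambda deps: [x for x in deps["load"] if x % 2 == 0], ["load"]),
    "summarize": (lambda deps: sum(deps["clean"]),                      ["clean"]),
}

def run(steps):
    """Execute the flow upstream-to-downstream and return every node's result."""
    order = TopologicalSorter(
        {name: set(upstream) for name, (_, upstream) in steps.items()}
    ).static_order()
    results = {}
    for name in order:
        fn, upstream = steps[name]
        results[name] = fn({u: results[u] for u in upstream})
    return results

print(run(steps)["summarize"])  # always 20, no matter how many times you re-run it
```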
On Commercializing Academic Research
Transforming an academic or research project into a fully-fledged product can be incredibly challenging and time-consuming for various reasons. In research, there are certain aspects that we may not prioritize or care about. For example, in a research prototype, I may not focus on allowing users to reset their passwords because my main goal is to build something that can be shown and studied rather than ensuring it is perfectly secure for everyday use.
When we initially faced this challenge, we had a prototype that showcased many impressive features and received positive feedback. However, it was not yet suitable for practical use by anyone. We dedicated a significant amount of time, around a year and a half, to revamp and re-architect the prototype in a way that would support repetitive and real-world use cases.
A significant early struggle was finding the right balance: how polished and feature-complete did the product need to be before users could actually try and test it? Working out that trade-off point was an interesting challenge to overcome.
On Go-To-Market
Early on, we had two significant advantages. First, coming from MIT provided excellent support for a startup to connect with companies. MIT's Industrial Liaison Program, which hundreds of companies, big and small, are part of, along with its associated conferences, made it easy for us to interact with potential customers. We took full advantage of this opportunity, and some of our earliest adopters came through this program.
The second advantage was being part of the DARPA program, which gave us access to nonprofit and government organizations as early adopters. These two factors played a crucial role in allowing us to engage with potential customers at an early stage.
Our go-to-market strategy has changed a bit. We offer a free version where anyone can create an account and try our product. The goal is to get users to experience the product quickly and determine if they like it. They can consider purchasing a subscription or an enterprise version if they do. We are following a product-led growth (PLG) go-to-market model, which differs from our earlier approach in the company's life cycle.
For those familiar with Jupyter or Python notebooks, Einblick offers an easy transition and addresses pain points associated with Python notebooks. We encourage everyone to visit our website, create an account, and start exploring the platform.
Our target audience primarily consists of individuals working with Python notebooks on a daily basis and seeking a tool to overcome the challenges they face. Due to our academic background, we also attract data science learners who are early in their careers and want to enhance their knowledge. Additionally, our product appeals to those teaching data science courses as our collaboration features simplify the teaching process. However, our core target persona is definitely the data scientist.
On Hiring
Hiring was one of the things that surprised me the most when transitioning from academia to a startup. In my naive mindset, I thought that once we secured seed funding, we could easily hire talented individuals. However, it turned out to be more challenging than expected. People have many options, and numerous startups are doing interesting work. Convincing individuals to join our company, believe in our vision, and commit to it requires much hard work. Therefore, I dedicated a significant amount of time early on to focus on building the team and the hiring process.
We strive to be open and honest and have a fun work environment. We strongly believe that attracting smart and hardworking people will, in turn, attract others who share those qualities. Since we are still a small team, we must prioritize how well we can collaborate with potential hires, especially during challenging times. Finding individuals who excel in communication, honesty, and openness is crucial.
One aspect that made our hiring process unique is that we, as former academics, have limited industry and startup experience. Therefore, we made it a priority early on to bring in individuals who possess startup and industry experience. We sought people who have been through similar experiences multiple times. This emphasis on experience was another critical factor during our early hiring stages.
On Academia vs. Startup Culture
One challenge when transitioning from academia to a startup is that academia trains you to focus on coming up with solutions. In a startup, however, it is not always clear what problem you are solving and who actually has it. Sometimes, you can become so invested in the solution you are building that you lose sight of the intended audience.
To overcome this, adopting a customer-first approach and engaging with as many people as possible is crucial. It is important not to become fixated on a solution you believe will work. This is something we had to address early on.
On the other hand, some of the methods taught in academia, such as running experiments and validating through data, translate well into the startup culture. Constantly running experiments and reevaluating based on data are essential in a startup environment. Therefore, academic training in these areas can be highly beneficial.
On Fundraising
One thing that has been helpful for me, or for us in general, given that we hadn't done this before, is talking to people who have gone through a similar journey of spinning an academic project into a company. Try to find individuals like that, have conversations with as many of them as possible, and listen to their experiences, their advice, and what they did or wish they had done differently.
Finding an investor with experience working with your specific background or story is essential. In our case, finding an investor with expertise in working with technical co-founders, and even academics with some startup experience, was crucial.
Additionally, asking investors for references and speaking with companies in their portfolios can be helpful. It's important to talk to the unsuccessful ones as well: former founders of portfolio companies that didn't do well can tell you how that investor operates not only when things are going well but also when they aren't, which gives you a better sense of their approach.
From a tactical standpoint, timing is crucial. We didn't do our best to ensure that all our conversations with different venture capitalists were at the same stage. Make sure that if you receive a term sheet from one investor, you're not in a position where another investor you've been talking to isn't ready to give you a term sheet yet while the one you have is about to expire. It takes some strategy to manage a process like this and ensure the timing aligns well.
On Being A Researcher vs. Being A Founder
The two topics we previously discussed are highly relevant here. One is the importance of experimentation, a scientific approach, and continuously analyzing data; that mindset is effective in both roles, and the skills learned in academia, like conducting numerous experiments and being scientific in your approach, carry over directly.
Another important aspect is building teams. Coming from institutions like MIT and Brown University, it was always easy to find highly qualified and motivated students to work on research projects. However, hiring and finding people is much harder in the startup world. As mentioned, top talent has many options and is not actively looking for a job. Therefore, it is crucial to sell and convince them that your company is the right fit and that your vision is compelling.
Show Notes
(01:59) Emanuel reflected on his upbringing in Switzerland and his 4-year apprenticeship in Software Engineering and Business at Credit Suisse.
(06:09) Emanuel recalled his 4-year program in Computer Science at HSR (University of Applied Sciences Rapperswil).
(08:43) Emanuel touched on his decision to pursue a Master’s degree at Brown University in the US.
(11:58) Emanuel explained his decision to continue with a Ph.D. degree at Brown under the advisement of Professor Andy van Dam and summarized the arc of his Ph.D. research focus.
(14:50) Emanuel highlighted his first research paper called PanoramicData on Interactive Data Exploration.
(16:42) Emanuel shared his thoughts on common traits of a successful researcher.
(18:03) Emanuel emphasized the focus of his research at the intersection of Human-Computer Interaction, Information Visualization, and Data Analysis.
(20:38) Emanuel shared valuable lessons from interning twice at Microsoft Research in Redmond.
(22:34) Emanuel talked about his time as a postdoc in Professor Tim Kraska’s group at MIT CSAIL.
(24:59) Emanuel shared the founding story of Einblick - a visual computing platform that enables data teams to answer tougher, more meaningful questions by making advanced analytics and model building more streamlined and accessible.
(29:06) Emanuel touched on the responsibilities of Einblick's 5 co-founders.
(30:23) Emanuel highlighted technical challenges of building Einblick's integrated environment for descriptive, predictive, and prescriptive analytics.
(32:09) Emanuel mentioned the collaboration challenge in data and brought up Einblick's real-time remote collaboration through video-enabled data whiteboards.
(35:39) Emanuel highlighted the challenges of working with computational notebooks and brought up the benefits of using Einblick's collaborative visual canvas.
(38:54) Emanuel unpacked the challenges of commercializing an academic research project.
(40:27) Emanuel gave a broad overview of Einblick's go-to-market strategy.
(43:55) Emanuel shared valuable hiring lessons to attract the right people who are aligned with Einblick’s cultural values.
(47:05) Emanuel shared fundraising advice to founders who are seeking the right investors for their startups.
(49:20) Emanuel shared the similarities and differences between being a researcher and being a founder.
(50:43) Closing segment.
Emanuel's Contact Info
Einblick's Resources
Notebook Feature Release (2022)
Mentioned Content
Papers and Projects
PanoramicData is a hybrid pen and touch system for visual data exploration (InfoVis 2014 Paper | Video)
(s|qu)eries (pronounced “Squeries”) is a visual query interface for creating queries on sequences (series) of data based on regular expressions (CHI 2015 Paper | Summary Video)
Vizdom is an interactive visual analytics system that scales to large datasets through progressive computation (VLDB Demo 2015 Paper | Health Video | Election Video)
Tableur is a spreadsheet-like pen- and touch-based system that revolves around handwriting recognition - all data is represented as digital ink (CHI 2016 LBW Paper | Video)
Towards Accessible Data Analysis (Emanuel's Ph.D. Dissertation at Brown, 2018)
Northstar is an interactive data science platform that combines data exploration with automated machine learning (SIGMOD DEEM Paper | Video)
People
Books
"The Book of Why" (by Judea Pearl)
"The Signal and The Noise" (by Nate Silver)
Notes
My conversation with Emanuel was recorded back in late 2022. Since then, Einblick has launched Einblick Prompt and ChartGenAI, both of which I recommend checking out.
About the show
Datacast features long-form, in-depth conversations with practitioners and researchers in the data community to walk through their professional journeys and unpack the lessons learned along the way. I invite guests coming from a wide range of career paths — from scientists and analysts to founders and investors — to analyze the case for using data in the real world and extract their mental models (“the WHY and the HOW”) behind their pursuits. Hopefully, these conversations can serve as valuable tools for early-stage data professionals as they navigate their own careers in the exciting data universe.
Datacast is produced and edited by James Le. For inquiries about sponsoring the podcast, email khanhle.1013@gmail.com.
Subscribe by searching for Datacast wherever you get podcasts, or click one of the links below:
If you’re new, see the podcast homepage for the most recent episodes to listen to, or browse the full guest list.