Solving a Computer Vision task with AI assistance

Hello There,
It is December 2024. As we inch closer to the end of the year, one thing that has transformed my world as a problem solver is AI, specifically Generative AI.
What do I mean by transform?
When I say transform, I truly mean it in a "revolutionary" sense. I've been a data practitioner solving small, big, and complex problems for companies wanting to leverage data for everything from decision making to strategic advantage. However, given the vastness of the world of data, it is almost impossible for a problem solver to learn every tool out there at the depth that could help them solve a problem better.
For example, I've been a BI professional for most of my career, so my expertise mostly lay in analytics and visualization. Machine learning was always enticing to me, but I seldom got a chance to fiddle around with it and see where and how it could solve a problem for me. My demanding work schedule never really gave me the space to learn ML, or rather AI, properly.
However, here's where the "transformation" came in, with the advent of GenAI (read: ChatGPT, which I use the most).
I really don't need to wait to learn everything before attempting my first problem. I can start using AI to solve a problem just as I'd seek help from a knowledgeable human who'd be ready to help instantly (almost impossible to find).
Let me show you a practical example of how I learnt the foundations of Computer Vision (a completely new topic for me) as a result of solving a simple daily-life problem.
One of my bright students created a gif of his dashboard work done in Power BI. Great job done!
However, I thought: why not extract a few key moments (read: frames) from his work (read: GIF file), put them together as a swipe-style PDF, and post it on LinkedIn to promote his work?
The idea was nice, but here was the problem: how do I go from a GIF to only those "key moment" snaps that are just enough to send the right message?
Here's what the collection of images looked like when I first extracted all the frames of the GIF file into a folder. This was an easy task via Python, and I asked ChatGPT to help me with the code:
This would be a simple task for a practicing Python programmer. Having lost touch a bit, I needed ChatGPT's support. The code, however, was straightforward:
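The original snippet isn't reproduced here, but a minimal sketch of that frame-extraction step might look like the following, assuming the Pillow library; the file and folder names are placeholders:

```python
from pathlib import Path

from PIL import Image, ImageSequence


def extract_frames(gif_path: str, out_dir: str) -> int:
    """Save every frame of a GIF as keyframe_<i>.png and return the frame count."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    count = 0
    with Image.open(gif_path) as gif:
        # ImageSequence.Iterator walks the GIF frame by frame, in time order.
        for i, frame in enumerate(ImageSequence.Iterator(gif)):
            frame.convert("RGB").save(out / f"keyframe_{i}.png")
            count += 1
    return count


# Placeholder paths; substitute the real GIF and output folder.
# extract_frames("dashboard_demo.gif", "frames")
```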
Here's how the downloaded and unzipped folder looks:
Then I suddenly realized: this is nothing but a collection of images indexed in time. For example, keyframe_0.png is the first image over time and keyframe_47.png is the last one.
Man! This is an image time series. (Read: a video.)
So, my next question : How do I visually solve this?
Let's design an algorithm :
Step 1: Starting from left to right, top to bottom, compare every image with the next one.
Step 2: Note the pair wherever your intuition says, "Hey, they look different!" For example, look at frames 12 and 13, or frames 30 and 31. They look distinctly different.
Step 3 : Remove the rest of the frames from the collection.
What's left are the keyframes :)
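The three steps above can be sketched in code. I don't know which similarity metric ChatGPT actually used (SSIM is a common choice); the normalized pixel-difference score below is a simple stand-in, assuming the frames have already been loaded as NumPy arrays of equal shape:

```python
import numpy as np


def similarity(a: np.ndarray, b: np.ndarray) -> float:
    """1.0 for identical images, approaching 0.0 as the pixels diverge."""
    # Mean absolute pixel difference, normalized to [0, 1], then inverted.
    diff = np.abs(a.astype(float) - b.astype(float)) / 255.0
    return 1.0 - diff.mean()


def consecutive_scores(frames: list) -> list:
    """Step 1: score each frame against the next one, left to right."""
    return [similarity(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]
```

A sharp drop in one of these scores is exactly the "Hey, they look different!" moment of Step 2.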
Now, I thought, let's get back to AI and check how it can help.
This is what I asked ChatGPT :
I got this as outcome :
Next, I thought: what if I read this into an Excel sheet and visualize these numbers as a time series (line) chart? So, first I asked it to download this data in the required format:
It gave me this :
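The exported file itself isn't shown here, but a sketch of how such an export could be produced, assuming the frame names and scores are held in Python lists (Excel opens CSV files directly):

```python
import csv


def save_scores_csv(frame_names, scores, path):
    """Write one row per consecutive pair: (frame_a, frame_b, similarity)."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame_a", "frame_b", "similarity"])
        for i, score in enumerate(scores):
            writer.writerow([frame_names[i], frame_names[i + 1], score])
```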
Nice. Now this made me think: how do I extract the key frames?
Answer: pick those entries from the Excel sheet where there is a "significant" drop in the similarity score. For example, as we've seen, the similarity score for the pair (keyframe_11.png, keyframe_12.png) is 0.99973, while that for (keyframe_12.png, keyframe_13.png) is 0.36505. The drop from 0.99973 to 0.36505 can mean only one thing: this is where significantly different images were found.
A nice detection tool :) Let's now "see" this for ourselves:
See all the "blips", like changes in the signal of an ECG? Those are the places where the image similarity changed, and hence those images would be selected as keyframes.
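I drew the chart in Excel, but the same line chart can be sketched programmatically, assuming matplotlib is installed and using a made-up scores list:

```python
import matplotlib

matplotlib.use("Agg")  # render straight to a file, no display needed
import matplotlib.pyplot as plt


def plot_scores(scores, path):
    """Draw the similarity scores as a line chart and save it as an image."""
    fig, ax = plt.subplots()
    ax.plot(range(len(scores)), scores, marker="o")
    ax.set_xlabel("consecutive frame pair")
    ax.set_ylabel("similarity score")
    ax.set_title("Frame-to-frame similarity over time")
    fig.savefig(path)
    plt.close(fig)


# Hypothetical scores; the sharp dips are the "blips" in the text.
# plot_scores([0.99, 0.99, 0.37, 0.98, 0.55], "similarity.png")
```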
However, let's pause momentarily and notice what's happening in frames 27, 28, 29, 30, and 31. The blips are small, which means the differences aren't that big. Let's manually check the images:
The change from frame 27 to 28 is significant, as is the change from 30 to 31. However, between frames 28, 29, and 30 there are only minor changes. Are those important? What if the differences are even less important?
This is where we define a tolerance limit. For example, if I set a tolerance limit of 0.7 on the similarity score, then images that are 70% similar or more are considered the same, and only one of them is kept. (Remember: the lower the similarity score, the more noticeable the difference between the images.)
Considering a 0.7 tolerance limit, the final list of keyframes was:
Frames 0, 13, 27, 28, 31, 35, 37 and 43.
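The tolerance rule can be sketched as a small selection function: keep frame 0, then keep every frame whose similarity to its predecessor falls below the limit. (The function name and defaults are my own, not from the original code.)

```python
def select_keyframes(scores, tolerance=0.7):
    """scores[i] compares frame i with frame i+1; keep frame i+1 whenever the
    similarity drops below the tolerance limit. Frame 0 is always kept."""
    keep = [0]
    for i, score in enumerate(scores):
        if score < tolerance:
            keep.append(i + 1)
    return keep
```

Raising the tolerance keeps more near-duplicate frames; lowering it keeps only the starkest changes.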
And then I did a bit more tinkering and created a collage and that's it!
Got my problem solved. The best part: since a computer can solve this, all I need to do is stitch the different steps together as a macro and save it, so that the same set of steps can be performed at scale on even bigger GIFs, or even video formats.
But hold on.
This is the deciding point where AI becomes a boon or a bane for you.
If you just go home happy that AI did your job, you will slowly become embarrassingly dependent on it.
This is the time to be curious and ask it how it did what it did.
I was bubbling with questions, which I used as prompts back to ChatGPT, like:
" Wait a moment. You did something amazing. You could actually "see" these images and even compare them together! A machine being able to do something that we humans do at ease, is very fascinating to me. Please teach me how were you able to see an image? What does it even mean a machine being able to see an image? "
And this is where 95% of humans drop out. They let the triumph of the results dominate the curiosity they may have to dig deeper: curiosity not only to "know" what's happening behind that magical tool, but also to "appreciate" the humans who actually built the algorithms that make machines do such incredible things.
Here's a glimpse of where my learning journey in Computer Vision commenced.
Now I had the right motivation to learn, and I really wanted to delve deeper into the "theory" and reason out why something works.
Now my real learning started: not instead of AI, but with the assistance of AI.
I would love to know how you leveraged AI to solve a problem and, through that, started your own learning journey into something unknown but super interesting.