In this talk, I will discuss two of my projects (and some others) that can be used to build alternative AI systems:
Concept models - these help in building faster, smaller local inference.
Edge AI on device (Raspberry Pi for demonstration) - I will show how you can run TFLite models on the edge.
Both of these are meant to bring AI capability from Big Tech to your own GPUs and devices.
My slides:
https://docs.google.com/presentation/d/1r_KgnJIg7Z1RzV09rdfVYq5QIJ4Aem1L6jJZtafaQLI/edit?usp=sharing
My code that I will demonstrate (I will bring my Raspberry Pi and show how the sentiment analysis works on-device through small local models):
https://github.com/virajsharma2000/v-embedded-ai-rpi
https://github.com/virajsharma2000/v-lcm-demo
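The repositories above contain the actual demo code. As a stand-in illustration only (the word lists and scoring below are hypothetical and not taken from the demo), here is a minimal pure-Python lexicon-based sentiment scorer, sketching the kind of fully local, dependency-free inference the Raspberry Pi demo performs with a small TFLite model:

```python
# Minimal lexicon-based sentiment scorer: a hypothetical stand-in for the
# TFLite model used in the demo, showing fully local, offline inference.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "awful", "sad"}

def sentiment(text: str) -> str:
    # Count positive and negative lexicon hits and compare.
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this great little device"))  # positive
```

The real demo replaces the lexicon with a quantized neural model, but the shape of the program is the same: text in, label out, no network call.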
I will also talk about my AI journey as a young coder and some future tech I am working on.
I have given this talk very recently at a university in Delhi - please go through my webpage for this one and other talks I have given.
Thanks
What are some alternative open-source AI models and architectures that can run on local GPUs?
How can we use non-Big-Tech alternatives (to OpenAI, Gemini, etc.) through open-source models on our own GPUs and devices?
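A key technique that makes running such models on modest local hardware possible is quantization: storing weights as 8-bit integers instead of 32-bit floats, cutting memory roughly 4x. A minimal, illustrative sketch of the idea in pure Python (not any specific library's API):

```python
# Affine int8-style quantization: map floats in [min, max] onto 0..255.
# Illustrative sketch only; real frameworks (TFLite, llama.cpp) apply
# per-tensor or per-channel variants of this same idea.
def quantize(values):
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 255 or 1.0  # avoid division by zero for constant tensors
    q = [round((v - lo) / scale) for v in values]
    return q, scale, lo

def dequantize(q, scale, lo):
    return [v * scale + lo for v in q]

weights = [-1.2, 0.0, 0.7, 3.1]
q, scale, lo = quantize(weights)
approx = dequantize(q, scale, lo)
# Each value is recovered to within half a quantization step (scale / 2).
assert all(abs(a - b) <= scale / 2 for a, b in zip(weights, approx))
```

The trade-off is a small, bounded rounding error per weight in exchange for a model that fits in the RAM of a Raspberry Pi or a consumer GPU.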