Making Cloud Computing Accessible: The Key to Real Open AI
Artificial intelligence (AI) is poised to reshape many facets of our lives. To truly democratize it, however, we must first address a fundamental issue: access to cloud computing. Open-sourcing large language models (LLMs) is a significant step forward, giving everyone access to advanced models. Yet most of the computing power needed to use these models effectively is owned by large corporations, not individuals.
Prominent figures like Elon Musk have been strong advocates for open-sourcing AI models. While this is a positive move, the average person still struggles to leverage open-source models, lacking both the knowledge and the resources to access cloud computing. Without that computational power, building anything substantial with these models remains out of reach for most people.
In a previous post, Build Your Own LLMs App in Your Laptop, I explained how to install Ollama and run LLMs in a local environment. That guide aims to empower non-technical users to engage with AI freely, which is a significant step forward. However, running models locally is feasible only because inference requires far less computational power than training. This distinction is crucial: training a new large language model, or even fine-tuning an existing one, demands resources well beyond what a typical laptop can provide.
Small and medium-sized businesses, not just individuals, often lack the local resources to train or fine-tune large-scale AI systems. This limitation underscores the need for accessible cloud computing solutions. If we aim to achieve true “open AI,” we must lower the barrier to cloud computing, making it attainable for everyone, not just large, well-resourced organizations.
Cloud providers like AWS, Azure, and GCP offer web-based consoles that are reasonably approachable for experienced developers but present a steep learning curve for newcomers. At Hazl, we’ve aimed to make cloud computing more intuitive for a broader audience, including those without cloud engineering knowledge. Our platform enables quick provisioning of cloud resources and automated application deployment with a single click.
As an entrepreneur, AI practitioner, and, most proudly, an open-source community supporter, I believe the future of AI should blend proprietary and open-source models. Just as Linux offers an alternative in the operating system space, people should have the right to own and build upon the models they find most suitable for their needs. The key to achieving this is first making cloud computing easily accessible to everyone.