Bangladesh offers the best intelligence we have seen for SDGs 5 through 1 up to 2008 (search e.g. 4.1 oldest education, 4.6 newest education). 20th-century intelligence meant ending the poverty of the half of the world without electricity. Although Keynes in 1936 (in the last chapter of The General Theory of Employment, Interest and Money) asked economists to take a Hippocratic oath as the profession that ended extreme poverty, most economists did the opposite. What is not understandable is how educators failed to catalogue the lessons of the handful who bottom-up empowered villages to collaboratively end poverty. There are mainly two intelligences to understand: Borlaug on food, and Fazle Abed on everything that raised life expectancy in tropical village Asia from the low 40s to the 60s (about 7 years below the norm for those living with electricity and telecoms).

Between 1972 and 2001, Abed's lessons, catalogued in this MOOC, had largely built the nation of Bangladesh and been replicated, with the help of UNICEF's James Grant, across most tropical Asian areas. What's exciting is that the Valley's Mr and Mrs Steve Jobs invited Fazle Abed to share intelligences in 2001 at his 65th birthday party. The Jobs family and friends promised to integrate Abed's intelligence into their neighborhood university, Stanford, which in any event wanted Jobs' next great leap, the iPhone. The Valley told Abed to start a university so that women graduates from poor and rich nations could blend intelligence as Abed's bottom-of-the-pyramid villages began their journey of leapfrog models, now that grid infrastructures were no longer needed for solar and mobile. Abed could also help redesign the millennium goals, which were being greenwashed, into a shared worldwide system-coding frame by 2016.

Then, at Abed's 80th birthday party, the easy bit was checking that this MOOC was up to date. The hard bit: what did Abed mean by his wish to headhunt a Taiwanese American to head the university's third decade, starting 2020?

Saturday, December 31, 2011

Tracking AI Startups 2025 - 2011

With AI expected to be the soul of the world's largest companies, the idea that e.g. Nvidia's CEO Jensen Huang designs his company to track (even befriend) 18,000 startups may sound weird. In truth, the AI ecosystem needs (probably) more seeds than any other ecosystem: brain tools, channels to specific contexts; and in some cases, e.g. climate AI and life-sciences AI, it is effectively the future of most of that market's innovation.


Take the 2024 top 50 startups compiled by Forbes and originally tracked with Sequoia. Of the 50 startups in early 2024 (https://www.forbes.com/lists/ai50/?sh=3585c09f290f), fewer than 10 claim funding over half a billion dollars. Of course, by the time a startup is nearing unicorn status, it may well end startup life by being IPO'd or acquired.

But here are the mainly American big fish in the AI startup pool of early 2024:


OPEN AI - a one-off case: a Musk-funded NGO that turned for-profit and became producer of the GPT-4 LLM behind ChatGPT. It seems to have too many partners to be IPO'd unless Microsoft buys it; valued at over $11 billion. (When it comes to LLMs it is in a unique valuation league: it is said that while the first LLM cost about $1,000, launching a new one as a potential world leader would cost over $100 billion. Of course, LLMs leverage really big computing, so the biggest digital companies may well see the company and its main LLM architecture as inseparable.)


Anduril (AI defense) - $2.8 bn

Anthropic - $7.2 bn - another LLM maker with an odd story: probably first funded by Sam Bankman-Fried, the banker of crypto notoriety; timely enough to build an LLM, but it seems to have found a mix of partners, e.g. Amazon, to leverage computing capacity.

Cerebras - $720 million - chip manufacturer

Databricks - data storage and analytics - $4 bn (get this field right and you emerge alongside data warehousing's Snowflake).

Here is some commentary from Sequoia in 2023, which clarifies:

When we launched the AI 50 almost five years ago, I wrote, “Although artificial general intelligence (AGI)… gets a lot of attention in film, that field is a long way off.” Today, that sci-fi future feels much closer.

The biggest change has been the rise of generative AI, and particularly the use of transformers (a type of neural network) for everything from text and image generation to protein folding and computational chemistry. Generative AI was in the background on last year’s list but in the foreground now.

The History of Generative AI

Generative AI, which refers to AI that creates an output on demand, is not new. The famous ELIZA chatbot in the 1960s enabled users to type in questions for a simulated therapist, but the chatbot’s seemingly novel answers were actually based on a rules-based lookup table. A major leap was Google researcher Ian Goodfellow’s generative adversarial networks (GANs) from 2014 that generated plausible low resolution images by pitting two networks against each other in a zero sum game. Over the coming years the blurry faces became more photorealistic but GANs remained difficult to train and scale.
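To make the zero-sum game in Goodfellow's idea concrete, here is a minimal GAN training loop on toy 1-D data. This is a sketch in PyTorch; the tiny architectures, the synthetic "real" data and the hyperparameters are illustrative, not those of the original paper.

```python
# Minimal GAN sketch: a generator and a discriminator trained
# against each other in a zero-sum game (toy 1-D data).
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))   # noise -> fake sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))   # sample -> real/fake logit

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 1) * 0.5 + 2.0        # toy "real" data: N(2, 0.5)
    fake = G(torch.randn(64, 8))

    # Discriminator step: label real as 1, fake as 0
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: fool the discriminator into labelling fakes as 1
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The difficulty the essay mentions shows up exactly here: the two optimizers chase each other, and keeping that chase stable at scale is what made GANs hard to train.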

In 2017, another group at Google released the famous Transformers paper, “Attention Is All You Need,” to improve the performance of text translation. In this case, attention refers to mechanisms that provide context based on the position of words in text, which vary from language to language. The researchers observed that the best performing models all have these attention mechanisms, and proposed to do away with other means of gleaning patterns from text in favor of attention.
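For readers who want to see what an attention mechanism actually computes, here is a minimal sketch of scaled dot-product attention, the core operation of that paper, in NumPy. The shapes, random weights and data are illustrative stand-ins.

```python
# Scaled dot-product attention: every position queries every other
# position, so context comes from the whole sequence at once.
import numpy as np

def attention(Q, K, V):
    """Q, K, V: (seq_len, d) arrays of queries, keys, values."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # (seq_len, seq_len) affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions
    return weights @ V                              # weighted mix of values

seq_len, d = 5, 8
x = np.random.randn(seq_len, d)                  # stand-in token embeddings
Wq, Wk, Wv = (np.random.randn(d, d) for _ in range(3))
out = attention(x @ Wq, x @ Wk, x @ Wv)          # every token attends to every token
print(out.shape)                                 # (5, 8)
```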

The eventual implications for both performance and training efficiency turned out to be huge. Instead of processing a string of text word by word, as previous natural language methods had, transformers can analyze an entire string all at once. This allows transformer models to be trained in parallel, making much larger models viable, such as the generative pretrained transformers, the GPTs, that now power ChatGPT, GitHub Copilot and Microsoft’s newly revived Bing. These models were trained on very large collections of human language, and are known as Large Language Models (LLMs). 
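The parallelism point is worth seeing side by side. Below is a sketch (all values are stand-ins) contrasting an RNN-style loop, which must step through tokens one at a time, with a transformer-style pass that scores all positions in one matrix operation, using a causal mask so that training on next-token prediction stays honest.

```python
# Why transformers parallelize where RNNs cannot.
import numpy as np

seq_len = 6

# RNN-style: an inherently sequential loop; h_t depends on h_{t-1}
h = np.zeros(8)
for t in range(seq_len):
    h = np.tanh(h + np.random.randn(8))

# Transformer-style: one shot over all positions, masked to be causal
scores = np.random.randn(seq_len, seq_len)                    # stand-in for Q @ K.T
mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
scores[mask] = -np.inf                                        # no peeking at the future
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
```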

Although transformers are effective for computer vision applications, another method called latent (or stable) diffusion now produces some of the most stunning high-resolution images through products from startups Stability and Midjourney. These diffusion models marry the best elements of GANs and transformers. The smaller size and open source availability of some of these models has made them a fount of innovation for people who want to experiment.
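The core trick behind those diffusion models is easy to sketch: data is progressively mixed with Gaussian noise, and a network is trained to reverse the process one step at a time. Here is a minimal NumPy sketch of the forward (noising) process; the linear beta schedule and shapes are illustrative, not taken from any particular product.

```python
# Forward (noising) process of a diffusion model:
# x_t = sqrt(abar_t) * x_0 + sqrt(1 - abar_t) * eps
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)      # per-step noise amounts
alphas_bar = np.cumprod(1.0 - betas)    # cumulative signal retention

def noise(x0, t, rng=np.random.default_rng()):
    """Jump straight to noise level t and return the noised sample and the noise."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alphas_bar[t]) * x0 + np.sqrt(1 - alphas_bar[t]) * eps, eps

x0 = np.random.randn(32, 32)            # stand-in for an image
x_t, eps = noise(x0, t=500)
# Training target: given (x_t, t), predict eps; sampling runs the chain in reverse.
```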

Sequoia's visual of the 2023 top 50 (not reproduced here) makes the same point.


Trends in This Year's List

Generative AI Infrastructure: OpenAI made a big splash last year with the launch of ChatGPT and again this year with the launch of GPT-4, but their big bet on scale and a technique called Reinforcement Learning with Human Feedback (RLHF) is only one of many directions LLMs are taking. Anthropic and their chatbot Claude use a different approach called Reinforcement Learning Constitutional AI (RL-CAI). The CAI part encodes a set of human-friendly principles designed to limit abuse and hallucination in the outputs. Meanwhile Inflection, a secretive startup founded by DeepMind’s Mustafa Suleyman and Greylock’s Reid Hoffman, is focusing on consumer applications.
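To unpack the RLHF acronym a little: one of its key steps is training a reward model on human preferences between pairs of answers, typically with a Bradley-Terry style pairwise loss. Here is a minimal sketch; the tiny linear model and random features are stand-ins for a real reward model over text.

```python
# Reward-model step inside RLHF: the preferred answer should score
# higher than the rejected one (pairwise preference loss).
import torch
import torch.nn as nn

reward = nn.Linear(16, 1)                    # stand-in for a reward model
opt = torch.optim.Adam(reward.parameters(), lr=1e-3)

for step in range(200):
    chosen = torch.randn(8, 16)              # features of preferred answers
    rejected = torch.randn(8, 16)            # features of dispreferred answers
    # loss = -log sigmoid(r(chosen) - r(rejected))
    loss = -torch.nn.functional.logsigmoid(reward(chosen) - reward(rejected)).mean()
    opt.zero_grad(); loss.backward(); opt.step()
# The learned reward then steers the LLM via reinforcement learning (e.g. PPO).
```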
