Artificial intelligence (AI) has been a hot topic of discussion for many years now, and for good reason: AI has the potential to revolutionize many industries and aspects of our lives. Recently, the pace of AI innovation has accelerated sharply, and easy-to-use AI tools have become readily available to the broader population. These recent breakthroughs have the potential to change workflows across industries and may have significant impacts on investment portfolios.

AI – What Is It?

Many different kinds of AI tools are being developed as our skill at building new machine learning models increases. In particular, deep learning and large language models have advanced rapidly. Deep learning is a type of machine learning that uses artificial neural networks to learn from data. Neural networks are loosely inspired by the human brain, and they can learn complex patterns from data that would be difficult or impossible for traditional machine learning algorithms to capture.
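
To make that idea concrete, here is a minimal, purely illustrative sketch of a neural network learning a pattern from example data. It uses Python with the scikit-learn library (an assumption for illustration; the article does not name any specific tools), and the toy XOR pattern, which a simple linear model cannot learn but a small network with a hidden layer typically can.

```python
# Minimal sketch: a tiny neural network learning a pattern from data.
# Illustrative only; real deep learning models are vastly larger.
from sklearn.neural_network import MLPClassifier

# Toy data: the XOR pattern.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# One hidden layer of 8 neurons; lbfgs works well on tiny data sets.
model = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs",
                      max_iter=5000, random_state=0)
model.fit(X, y)

# Once the pattern is learned, this should typically print [1 0].
print(model.predict([[0, 1], [1, 1]]))
```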

Deep learning has been used to achieve state-of-the-art results in a variety of tasks, including image recognition, speech recognition, and natural language processing.

A similar process is used in the development of large language models (LLMs). These models can generate text, translate languages, write different kinds of creative content, and answer questions in an informative way. They have recently burst onto the scene because they are easy to work with and highly interactive. They have many uses for day-to-day productivity and show promise across a broad variety of content generation, from helping write code to drafting articles and other creative work.
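
As a rough sketch of how easy these models are to work with, the snippet below asks a small, publicly available language model to continue a sentence. It assumes the Hugging Face "transformers" library and the GPT-2 model purely for illustration; the article does not endorse any particular tool.

```python
# Sketch: generating text with a small, publicly available language model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Artificial intelligence will change investing by",
                   max_new_tokens=30)
print(result[0]["generated_text"])
```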

AI Constraints:

There are two primary inputs that act as limitations on the development of new AI models: data and processing power. Both resources are crucial. Data without processing power won't yield a very accurate model, and processing power applied to low-quality data will yield an AI tool full of biases or inaccuracies.

Processing power is the more straightforward of the two constraints. Bigger data sets or more complex AI setups require more processing power, which means more chips and more electricity. Not all chips are made equal, though: AI computing is substantially more efficient when done on graphics cards like those made by Nvidia, rather than on the standard computational processors that currently dominate most servers around the world. Nvidia's recent sharp stock price rise is almost entirely due to its raised guidance for expected graphics card revenues for use in AI computing. While Nvidia is currently the top player in the space, Microsoft has committed billions of dollars to help AMD design better chips for AI, and several other companies, including Apple, Meta, and Alphabet, have internal chip design programs that will likely focus on AI chips as demand in that sector grows.
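
The efficiency gap between graphics cards and standard processors is easy to see in practice. The sketch below, which assumes the PyTorch library and a machine that happens to have a GPU, times the same large matrix multiplication, the core operation inside neural networks, on the CPU and then on the GPU.

```python
# Sketch: timing the same matrix multiplication on CPU and (if available) GPU.
import time
import torch

size = 4096
a = torch.randn(size, size)
b = torch.randn(size, size)

start = time.time()
torch.matmul(a, b)
print(f"CPU time: {time.time() - start:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()
    start = time.time()
    torch.matmul(a_gpu, b_gpu)
    torch.cuda.synchronize()
    print(f"GPU time: {time.time() - start:.3f} s")
```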

Data is the trickier constraint that AI models face. While processing power is primarily a question of cost versus quality, data set availability is more nuanced. The internet as a whole is a great database, but it is very generalized and its quality is often questionable. More specialized data sets may be harder to access or may raise privacy concerns. There are also broad concerns about how data is presented when training an AI tool: will the data or the training parameters leave the AI with a bias, a tilt, or a blind spot? For instance, think about the difference in political perspective if the AI sampled its news from Fox News versus MSNBC. Similarly, what if an automated driving AI is trained without sufficient data for severe weather? The output will only be as good as the data input.
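
The severe-weather example can be made concrete with a small, entirely synthetic sketch (all numbers below are hypothetical, chosen only for illustration). A model that predicts required stopping distance is trained only on dry-road examples, then asked about icy roads it has never seen, and its predictions fall far short of what the conditions actually demand.

```python
# Sketch: a model trained without severe-weather data fails in severe weather.
# All data here is synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

speeds = rng.uniform(10, 30, size=200).reshape(-1, 1)   # metres per second
dry_distance = speeds[:, 0] ** 2 / (2 * 9.8 * 0.7)      # dry road, good grip
icy_distance = speeds[:, 0] ** 2 / (2 * 9.8 * 0.15)     # icy road, poor grip

# The training set only ever saw dry-road stopping distances.
model = LinearRegression().fit(speeds, dry_distance)

# On icy roads the model badly underestimates the distance actually needed.
predicted = model.predict(speeds)
print("Average shortfall on icy roads (m):", np.mean(icy_distance - predicted))
```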

Read the full article in Forbes.


Want to reevaluate your wealth management strategy in 2023? Contact the nationwide advising team at IHT Wealth Management today!