
10 Frequently Asked Questions about Artificial Intelligence

by GIGABYTE
Artificial intelligence. The world is abuzz with its name, yet how much do you know about this exciting new trend that is reshaping our world and history? Fret not, friends; GIGABYTE Technology has got you covered. Here is what you need to know about the ins and outs of AI, presented in 10 bite-sized Q and A’s that are fast to read and easy to digest!
1. What is artificial intelligence (AI)?
Artificial intelligence is a branch of computer science, first proposed in the 1950s, that is concerned with building machines that exhibit human-like behavior, such as independent decision-making and critical thinking. The rationale is that such a “smart” machine can solve a wider range of problems and is easier for humans to work with.

Over the decades, many different methods have been utilized to develop AI—to varying degrees of success. Since the 2010s, AI has advanced by leaps and bounds thanks to revolutionary new techniques like machine learning and deep learning. The current state of AI is so cutting-edge that it can recognize visual patterns through computer vision, converse with humans through natural language processing (NLP), and create content ranging from photorealistic images to sightseeing itineraries with generative AI. Industry leaders predict that AI will soon be able to shoulder more tasks that were traditionally considered to be “science fiction”, such as piloting autonomous, self-driving vehicles.

Learn More:
《Power of AI: How to Benefit from AI in the Automotive & Transportation Industry》
《Case Study: Constructing the Brain of a Self-Driving Car with GIGABYTE》
2. What are the different types of AI?
Because AI is the umbrella term for a range of technological breakthroughs, it is easy to confuse the AI that plays chess on the computer with the AI that can generate content or drive cars. To make it easier, we can differentiate AI by “capability” or “functionality”.

The three types of AI based on “capability” are artificial narrow intelligence (ANI, also known as “applied AI” or “weak AI”), artificial general intelligence (AGI, also known as “full AI” or “strong AI”), and artificial superintelligence (“ASI” or “superintelligence”). All current forms of AI are ANI—that is, they are designed to perform very specific tasks, and they cannot acquire new skills. AGI is on par with humans in that it can be taught new tasks; while this is still in the future, the groundwork has been laid with inventions like generative AI. ASI is the stuff of science fiction, such as the works of Isaac Asimov and Arthur C. Clarke, where AI surpasses humans in intelligence.

By “functionality”, AI can be separated into “reactive machines”, “limited memory”, “theory of mind”, and “self-aware”. The most famous “reactive machines” are the supercomputers that have beaten human grandmasters at chess or Go: they follow programmed rules and react to the current input, but they retain no memory of past interactions, and so they do not improve through “practice”. The recommendation engines used by media streaming platforms and the spam filters used by email servers are also “reactive machines”. “Limited memory” AI is more advanced because it can learn from historical data, as sketched in the example below. The AI “learns” through AI training by analyzing a massive amount of big data and adjusting the parameters of its algorithm; during AI inference, when the AI interacts with new information, it draws upon its training to make the correct decisions. Most modern forms of AI, from generative AI to autonomous vehicles, are “limited memory” AI.
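To make the distinction concrete, here is a minimal sketch in Python; the filter names, rules, and messages are invented for illustration and do not come from any real product. The “reactive” filter applies fixed rules and never changes, while the “limited memory” filter adjusts its parameters from historical, labeled examples.

```python
from collections import defaultdict

# Hypothetical illustration only: a "reactive machine" versus a "limited memory" AI.

# Reactive: fixed rules, no memory of past messages, never improves with use.
BLOCKED_PHRASES = {"lottery", "prize", "wire transfer"}

def reactive_spam_filter(message: str) -> bool:
    """Flags a message purely by matching fixed rules against the current input."""
    return any(phrase in message.lower() for phrase in BLOCKED_PHRASES)

# Limited memory: parameters are adjusted from historical, labeled examples.
class LimitedMemoryFilter:
    def __init__(self):
        self.word_weights = defaultdict(float)  # the parameters "learned" during training

    def train(self, labeled_messages):
        """AI training: adjust parameters using historical (message, is_spam) pairs."""
        for message, is_spam in labeled_messages:
            for word in message.lower().split():
                self.word_weights[word] += 1.0 if is_spam else -1.0

    def predict(self, message: str) -> bool:
        """AI inference: apply what was learned from history to a brand-new message."""
        return sum(self.word_weights[word] for word in message.lower().split()) > 0

history = [("win a free prize now", True), ("meeting agenda for monday", False)]
learned_filter = LimitedMemoryFilter()
learned_filter.train(history)
print(reactive_spam_filter("claim your lottery prize"))  # True: a fixed rule matched
print(learned_filter.predict("free prize inside"))       # True: learned from past examples
```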

Both “theory of mind” and “self-aware” AI are still theoretical concepts. “Theory of mind” AI understands human emotions and can respond in like manner, because that’s also a key part of human communication. “Self-aware” AI understands not only its human users, but also itself as an artificial construct. Such an AI would develop its own “emotions” and “goals”, which would make it truly beyond human control.
To better understand AI, we can separate them by “capability” and “functionality”. Currently, even the most advanced forms of AI are still a far cry from the kind of AI depicted in science fiction.
3. How does AI work?
At its core, an AI is a set of computer algorithms coupled with a massive dataset that allows it to interpret new data, understand which algorithm it needs to run, process the data, and deliver results. AI development in recent years has been expedited by a subset of machine learning called deep learning, which allows the AI to learn through experience and modify its behavior with little to no human intervention.

The way that the AI engages in learning is called AI training. During training, an enormous dataset is fed into an artificial neural network (ANN), a model with millions, billions, or even trillions of parameters that is loosely structured like a human brain. As the data passes through each layer of the network, the algorithm compares the output against the expected result and adjusts the model’s parameters, its “weights” and “biases”, accordingly. Through repeated iterations of predictions (forward propagations) and feedback (backward propagations), the parameters become precise enough that the network almost always produces the right output. To put it simply, the AI has practiced guessing the right response to different inputs so many times that it will nearly always make the correct guess when presented with new data.
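The training loop described above can be expressed in a few lines of code. The following is a minimal sketch, assuming the PyTorch library and a tiny invented dataset; real-world AI training runs the same forward/backward cycle over far larger models and datasets.

```python
import torch
import torch.nn as nn

# Toy dataset: 100 samples with 4 features each, and a binary label (illustrative only).
inputs = torch.randn(100, 4)
labels = (inputs.sum(dim=1, keepdim=True) > 0).float()

# A tiny artificial neural network: layers of weighted connections ("weights" and "biases").
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(50):
    predictions = model(inputs)           # forward propagation: make a guess
    loss = loss_fn(predictions, labels)   # check the guess against the known answer
    optimizer.zero_grad()
    loss.backward()                       # backward propagation: compute feedback
    optimizer.step()                      # adjust the weights based on that feedback
```

Each pass through the loop nudges the weights and biases a little closer to values that produce the right answers.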

During AI inference, the AI model interacts with fresh, unlabeled data in the real world. It relies on the “memory” of its training to generate the right output. Whether it guesses correctly or not, the results can be stored for the next round of AI training, which further improves the AI.
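Continuing the hedged PyTorch sketch above, inference skips the feedback step entirely: the weights saved after training are loaded and applied as-is to new, unlabeled data. The file name below is a hypothetical placeholder.

```python
import torch
import torch.nn as nn

# Rebuild the same architecture and load the weights saved after training (hypothetical path).
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
model.load_state_dict(torch.load("trained_model.pt"))
model.eval()  # switch to inference mode

new_data = torch.randn(1, 4)  # fresh, unlabeled input from the real world
with torch.no_grad():         # no backward propagation during inference
    probability = torch.sigmoid(model(new_data))
print("Predicted probability:", probability.item())
```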

Learn More:
《Introducing GIGABYTE AI Servers used to develop artificial intelligence》
《Tech Guide: To Harness Generative AI, You Must Learn About “Training” & “Inference”》
4. How can I benefit from AI?
The fact of the matter is, you already are. When you type your question into a search engine and it produces the results you want; when you drive on the highway and the electronic tolling system reads your license plate without requiring you to slow down; when you ask ChatGPT to write an email for you—those are the benefits of modern AI.

Going forward, AI is certain to bring meaningful change to a wide range of industries, leading to a complete transformation of our day-to-day lives. To list just a few examples, the Advanced Driver Assistance Systems (ADAS) in our cars will progress through the different stages of sophistication until they achieve true autonomy; doctors will be able to spot signs of disease more quickly and precisely using AI-assisted medical imaging; farmers will be able to protect their crops by using AI to make accurate weather predictions, or even spot the symptoms of dangerous infections via satellite imagery.

It goes without saying that you can benefit from AI even further if you take a proactive role. After all, modern AI applications are so new and game-changing that experts in different sectors are discovering innovative ways to benefit from them every day. Your imagination is the limit to how you can reap the rewards of AI.

Learn More:
《Case Study: Spain’s IFISC Protect Precious Olive Groves with GIGABYTE Servers》
《Case Study: GIGABYTE PILOT Puts Taiwan’s First Self-Driving Bus on the Road》
5. What kind of people work with AI?
In a sense, we are all AI end users. Whether you are a student or have a job, whether you work in manufacturing or services, whether you drive an 18-wheeler or practice delicate brain surgery, AI has become part of your bread and butter in one form or another.

If we’re talking about the people who develop and provide AI services, that would be an entire supply chain beginning with hardware manufacturers upstream; AI model developers, model hubs, and MLOps services midstream; and cloud service providers (CSPs) offering dedicated applications and services downstream. This is a vast field full of different disciplines and populated by all sorts of professionals. In fact, one of the only unifying threads is that all these experts use the same tools—servers, which we will discuss in the next section.

It should be noted that academic and research institutes also play an important role in the supply chain. Academia is always pushing the envelope of AI development, whether by creating innovative services that can be spun off into profitable businesses, or by holding supercomputing competitions where participants break world records with existing tools. It should come as no surprise that these researchers primarily work with servers as well.

Learn More:
《Case Study: Rey Juan Carlos University Delves into Cellular Aging Mechanisms with AI》
《Case Study: GIGABYTE Helps NCKU Supercomputing Team Break NLP World Records》
Here are some examples of the key players in the AI supply chain. GIGABYTE provides not only server solutions but also MLOps services through its investee company MyelinTek.
6. What kind of computers are used for AI?
AI runs on a special kind of enterprise-grade computer called a server. Personal computers and mobile devices can also run AI, but their capabilities are severely limited by the consumer-grade hardware. This is why most AI services require an internet connection, so that your PC or smartphone can connect to the computing resources on the cloud.
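As a rough illustration of this client-to-cloud arrangement, the snippet below sends a prompt from a lightweight device to a hypothetical inference endpoint; the URL and response fields are placeholders rather than any real provider’s API. The heavy computation happens on the servers behind that endpoint.

```python
import json
import urllib.request

# Hypothetical cloud inference endpoint; the AI model itself runs in a remote data center.
ENDPOINT = "https://api.example.com/v1/generate"

payload = json.dumps({"prompt": "Write a short thank-you email."}).encode("utf-8")
request = urllib.request.Request(
    ENDPOINT,
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(request) as response:  # the client only sends and receives text
    print(json.loads(response.read())["text"])     # "text" is an assumed field in the placeholder API
```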

The main advantage of using servers is that specialized roles can be assigned to different servers built specifically to fulfill such functions. For example, some servers might be designed for high-performance computing (HPC), while others may excel at storage. By placing a network of specialized servers at the users’ disposal, a multitude of efficient, effective, and reliable AI services can be made available.

Servers are usually kept in data centers or server farms to ensure uninterrupted operations and high availability. CSPs like Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure may construct massive data centers to host thousands of servers. Smaller server rooms may be built inside private organizations to host internal AI services. Some powerful desktop computers, which are in reality a class of servers known as workstations, can also handle moderate AI workloads.

Learn More:
《Purchase GIGABYTE HPC Servers to enjoy the benefits of high-performance computing》
《Choose GIGABYTE Workstations to deploy enterprise-grade computing on your desktop》
《Tech Guide: What is a Server?》
7. What is an AI server?
The bespoke quality of servers has led to the creation of specialized AI servers. GIGABYTE has a dedicated webpage on the topic of AI servers and other products. Briefly, an AI server is different from a regular server in the following ways:

Since both AI training and AI inference are very compute-intensive, powerful processors are added to the server to turn it into a supercomputing platform. In general, there are two types of processors in a server—the central processing unit (CPU) and graphics processing unit (GPU). An AI server is likely to include the latest iterations of both kinds of chips.

In the case of the CPU, top-of-the-line x86 chips made by AMD or Intel will be chosen if the user wants to work with the CISC-based ecosystem, or ARM chips if the user prefers the cloud-native characteristics of RISC-based products. GPUs, on the other hand, are accelerators that can help the CPU compute even faster. Depending on the nature of the AI workload, the user may choose different GPU modules or expansion cards. For example, the GIGABYTE G593-SD0, G593-SD2, and G593-ZD2 are three servers from GIGABYTE that are integrated with the NVIDIA HGX™ H100 8-GPU, one of the most powerful modules for AI training on the planet. The GIGABYTE G293-Z43 houses a highly dense configuration of sixteen AMD Alveo™ V70 cards, which makes it an optimized platform for AI inference.
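As a hedged sketch of how software takes advantage of these accelerators, the PyTorch snippet below runs the same computation on a GPU when one is available and falls back to the CPU otherwise; AI frameworks dispatch work to the GPUs in an AI server in roughly this fashion.

```python
import torch

# Pick the GPU accelerator if one is present; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("Running on:", device, "| GPUs visible:", torch.cuda.device_count())

# The same matrix multiplication, simply placed on whichever processor was selected.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
result = a @ b  # on a GPU this runs massively in parallel
print(result.shape)
```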

Other components in an AI server, such as the memory, storage, and PSU, are also likely to employ the latest breakthroughs to make sure the server can efficiently process vast quantities of data and perform sophisticated AI computations.

Learn More:
《More information about GIGABYTE Servers powered by AMD CPUs》
《More information about GIGABYTE Servers powered by Intel CPUs》
《More information about GIGABYTE’s ARM Servers》
Simply put, an AI server uses state-of-the-art processors and other components to support AI development and application with the most powerful computing performance.
8. What is GIGABYTE Technology’s role in advancing AI?
GIGABYTE Technology is a world-renowned industry leader in AI and HPC servers, server motherboards, and workstations. Its role in advancing AI encompasses both hardware and software. On the server side, GIGABYTE not only has a specialized line of AI servers, but also a comprehensive portfolio of different server series that fulfill various roles in a data center. For example, there are the G-Series GPU Servers for GPU-accelerated heterogeneous computing, S-Series Storage Servers for data storage, R-Series Rack Servers for versatile applications, and H-Series High Density Servers for multi-node computing in a compact chassis.

GIGABYTE works closely with chip manufacturers farther up the supply chain to ensure that our server solutions are equipped with the latest CPUs and GPUs from AMD, Intel, NVIDIA, and other suppliers. This close relationship with other industry heavyweights guarantees that when you buy GIGABYTE, you are getting the most advanced and certified computing platforms for AI. For instance, the GIGABYTE G593-SD0 GPU Server was the first SXM5 server on the market to receive NVIDIA certification. GIGABYTE also took the lead in producing servers powered by ARM CPUs for users that work with a lot of data from edge devices—such as the “high-precision traffic flow model” developed by National Taiwan University (NTU) to simulate road conditions in the lab for the testing of autonomous vehicles.

In addition to core server components, GIGABYTE also incorporates cutting-edge inventions that help process AI workloads while providing other benefits. GIGABYTE has a complete line of liquid-cooled and immersion-cooled server solutions that use liquid coolant instead of air to dissipate the heat generated by processors. These innovative thermal management systems allow the processors to run at their maximum TDP while improving the data center’s PUE, effectively reducing carbon emissions. Other value-added services include GIGABYTE Management Console (GMC) and GIGABYTE Server Management (GSM), which are remote management software tools provided free of charge when you purchase GIGABYTE servers.

On the software side, GIGABYTE’s investee company MyelinTek provides the DNN Training Appliance for MLOps. This package offers AI developers an ideal environment to manage datasets and engage in AI model analysis.
9. Who are some of the major developers of AI?
Since much of AI development is dependent on advancements in computing capabilities, many of the aforementioned chip developers, such as AMD, Intel, and NVIDIA, are considered important pioneers in AI tech. Server brands like GIGABYTE Technology help to bring their latest products to users who are engaged in AI development and application. For example, Rey Juan Carlos University in Madrid relies on a computing cluster made up of GIGABYTE servers to leverage state-of-the-art AMD EPYC™ and Intel® Xeon® Scalable CPUs, as well as NVIDIA HGX™ GPU computing modules, in their research into cellular aging.

Global tech giants like Alphabet, Amazon, Apple, IBM, Meta, and Microsoft are also leaders in the field of AI. They provide hardware and software platforms for other AI developers to use, and they develop new AI applications and services for the end users who utilize their products.

Other companies are trailblazers of AI for specific applications. To list just a few examples, OpenAI is renowned for their generative AI models, as well as the AI tools which are built on those models, such as ChatGPT. Mobileye is an Israeli developer of ADAS and autonomous driving technologies that is leading the charge to put self-driving cars on the road.
10. What is the future of AI?
Long before AI becomes “self-aware” or surpasses human intelligence, it will be used to benefit every aspect of our day-to-day lives, from how we work to how we travel, from how we play to how we take care of our health. There are two industry indicators that you can think of as the bellwethers of AI progression.

● Computing platforms: Advancements in processing power and server hardware will help to nurture more powerful AI. Therefore, look for jumps in AI development when hardware manufacturers introduce new computing products with the ability to calculate a larger amount of data even faster. A good tech brand to keep an eye on is GIGABYTE Technology! One, because we are constantly launching new AI servers that are outfitted with the latest breakthroughs in hardware technology that will expedite AI development; and two, because we frequently publish new content on our official website delving into the latest AI-related success stories and in-depth analyses.

● Software development: On the back of new computing hardware, new AI models and applications will be invented for use in diverse vertical markets. Therefore, you can expect to see new AI-assisted services being made available shortly after the announcement of breakthroughs in AI software.

Thank you for reading “10 Frequently Asked Questions about Artificial Intelligence” by GIGABYTE. For further consultation on how you can benefit from AI in your business strategy, academic research, or public policy, we welcome you to reach out to our sales representatives at [email protected].

Learn More:
《Tech Guide: Server Processors, the Core of a Server’s Performance》
《Tech Guide: How to Pick a Cooling Solution for Your Servers?》