ADLINK has introduced a pocket-sized, portable NVIDIA RTX A500 GPU that steps in for your system's iGPU to accelerate AI workloads.
ADLINK's Pocket AI Puts Graphics Computing In Your Hands With The Power Of NVIDIA's RTX GPUs
Imagine carrying a GPU with AI-boosting performance in your pocket. While that may sound far-fetched, ADLINK has done exactly that with its portable Pocket AI GPU, which packs an NVIDIA GPU with Tensor Core technology.
The concept behind this technology is interesting: it aims to bring AI and graphics computing capabilities to virtually any device in a plug-and-play fashion. The ADLINK Pocket AI is designed specifically for professionals whose AI tasks require a dedicated GPU. To highlight the capabilities of this tech, StorageReview has published an extensive review covering every aspect in detail.
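To illustrate what "plug-and-play" means in practice, here is a minimal sketch (not from ADLINK or StorageReview, and assuming NVIDIA drivers and PyTorch are installed on the host): once the Pocket AI is attached over Thunderbolt, a framework simply sees the RTX A500 as another CUDA device.

```python
# Sketch only: how an AI framework would pick up the Pocket AI's RTX A500
# once it is connected over Thunderbolt and the NVIDIA driver is installed.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda:0")          # the external RTX A500
    print(torch.cuda.get_device_name(0))     # reports the attached GPU's name
else:
    device = torch.device("cpu")             # fall back to CPU if no GPU is found

# Models and tensors are then dispatched to the external GPU as usual.
x = torch.randn(1, 3, 224, 224, device=device)
```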
Starting with the onboard GPU, the Pocket AI houses the NVIDIA RTX A500, which packs 4 GB of GDDR6 memory on a 64-bit bus along with 2048 CUDA cores and 64 Tensor cores. It delivers 6.54 TFLOPS of peak FP32 compute at a 25W TDP. On paper, the GPU's specs fall below those of the RTX 3050, but if you are running an older machine with only an iGPU, it can be a worthwhile upgrade, especially for AI-intensive tasks.
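As a sanity check on that figure, the standard peak-FP32 formula (2 FLOPs per CUDA core per clock, from fused multiply-add) lets you back out the boost clock implied by the quoted numbers. The short Python sketch below does the arithmetic; the ~1.6 GHz result is an inference from the article's figures, not an official spec.

```python
# Back-of-the-envelope check of the quoted peak FP32 figure,
# assuming 2 FLOPs per CUDA core per clock (fused multiply-add).
cuda_cores = 2048
peak_fp32_tflops = 6.54

implied_boost_ghz = peak_fp32_tflops * 1e12 / (2 * cuda_cores) / 1e9
print(f"Implied boost clock: {implied_boost_ghz:.2f} GHz")  # ~1.60 GHz
```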
As for the design, the Pocket AI is wrapped in a rubber sleeve that makes the GPU easier to carry and protects users against potential burns, since the NVIDIA RTX A500 runs quite hot. The unit sports a black and subtle green color scheme with two ports onboard: a USB-C Power Delivery port and a Thunderbolt 3 port that connects the GPU to your device. The cooling setup on the Pocket AI is simple, with an intake fan on the bottom and ample cutouts to keep air flowing.
It all comes down to how the Pocket AI performs in real-world applications. StorageReview benchmarked the GPU in Luxmark and UL's Procyon AI Inference benchmark, and its scores come in at over twice those of an iGPU. The figures are listed below, followed by a quick look at the speedup they imply:
| Benchmark | Intel Iris Xe (96 EU) | ADLINK Pocket AI | RTX A5000 Laptop GPU |
| --- | --- | --- | --- |
| Luxmark Hall | 1828 | 3979 | 14,226 |
| Luxmark Food | 869 | 1837 | 5,499 |
| Procyon AI Inference | 59 | 264 | 651 |
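For context (this calculation is mine, not StorageReview's), the relative uplift over the iGPU works out as follows when computed straight from the table:

```python
# Speedup of the Pocket AI's RTX A500 over the Iris Xe iGPU, computed from the
# scores in the table above (higher is better in all three tests).
scores = {
    "Luxmark Hall":         {"iris_xe": 1828, "pocket_ai": 3979},
    "Luxmark Food":         {"iris_xe": 869,  "pocket_ai": 1837},
    "Procyon AI Inference": {"iris_xe": 59,   "pocket_ai": 264},
}

for test, s in scores.items():
    print(f"{test}: {s['pocket_ai'] / s['iris_xe']:.1f}x the iGPU")
# Luxmark Hall: 2.2x, Luxmark Food: 2.1x, Procyon AI Inference: 4.5x
```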
The benchmarks reveal that while the Pocket AI doesn't pack immense raw power, it is well suited to AI-oriented workloads. The product is ideal for situations where keeping an existing device is preferable to upgrading to a new one. If you want to add extra graphics computing capability to your device, the Pocket AI is a sensible way to do it. However, combining that power with portability comes at a cost.
The Pocket AI is currently priced at $429 US, which is a steep price tag judging by the onboard components. To put that in perspective, the NVIDIA GeForce RTX 4060 Ti currently retails for a similar price, so I don't believe the Pocket AI is priced right. But that is the case with almost all external and modular GPU options: the GPD G1, which comes with an AMD RX 7600M XT GPU, costs around $800 US, roughly the same price as the enthusiast-grade RX 7900 XT.
Moreover, while the Pocket AI gives you both portability and computing power, in the grand scheme of things most consumers won't go to the trouble of carrying a GPU around every day; they will prefer a permanent solution, which means getting a better device. Thermal concerns with the Pocket AI were also reported, which are to be expected in devices this compact.
Overall, the Pocket AI looks like an innovation that brings a fair share of problems with it. Although the direction is right, ADLINK's Pocket AI can't be a success unless it arrives at a better price and with improvements in thermal dissipation.