Items tagged with AI

As artificial intelligence and machine learning take on ever larger roles in everyday tasks, there is a need for ever-faster hardware and architectures. Arm is on the case. During its DevSummit conference this week, Arm's senior director of technology, Ian Bratt, talked a bit about what's in store for the company's next-gen GPU architecture. His 20-minute keynote focused largely on AI and ML technologies, including things like the human plasticity curve and other buzz phrases that are not likely to be widely recognized by the general public. They are, however, important for how consumer devices operate and what they are capable of doing. Think about your smart... Read more...
In 2020, a study estimated that the IT industry spent $2 trillion on software development costs associated with debugging code, and that 50 percent of IT budgets were allocated to debugging alone. Intel hopes to change those numbers by making its ControlFlag tool open-source. ControlFlag is an AI-powered tool created by Intel to detect bugs within computer code using advanced self-supervised machine learning (ML). The software, developed last year, was able to locate hundreds of confirmed software defects in proprietary, production-quality software systems in just a few analyses of source code repositories. Its machine learning techniques enable it to find coding anomalies,... Read more...
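ControlFlag itself targets C/C++ source and learns what "typical" code looks like from huge public repositories, then flags statements that deviate from the norm. As a loose Python-flavored illustration only (the checker below hard-codes a single suspicious pattern rather than learning one), this is the kind of anomalous condition such a tool might surface:

```python
import ast

# Toy illustration only: ControlFlag learns "typical" patterns from huge code
# corpora via self-supervised ML; this sketch instead hard-codes one known
# anomaly (comparing against a literal with "is" instead of "==") to show the
# kind of suspicious condition such a tool might flag.
SOURCE = '''
def check(status):
    if status is "OK":   # anomalous: identity check against a string literal
        return True
    return False
'''

class LiteralIsCheck(ast.NodeVisitor):
    def visit_Compare(self, node):
        for op, right in zip(node.ops, node.comparators):
            if isinstance(op, (ast.Is, ast.IsNot)) and isinstance(right, ast.Constant):
                if right.value is not None:  # "is None" is the idiomatic exception
                    print(f"line {node.lineno}: 'is' used to compare against a literal")
        self.generic_visit(node)

LiteralIsCheck().visit(ast.parse(SOURCE))
```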
There's quite a bit of moving and shaking in the chip industry right now—Apple just unveiled its M1 Max and M1 Pro chips, Google provided more details on its upcoming Tensor SoC, and of course Intel's Alder Lake launch looms right around the corner. But as it applies to the mobile space, Qualcomm has a message for the competition—Snapdragon is still a formidable architecture. To highlight the point, Qualcomm today is pointing to an updated set of MLPerf results. "There’s only one way to say it: the Snapdragon 888+ 5G Mobile Platform blew away the competition in the latest MLPerf Mobile Inference v1.1 benchmark submissions," Qualcomm begins its blog post on the results. "In one... Read more...
Facebook's artificial intelligence division has announced a new long-term research project called Ego4D, an ambitious effort to expand the boundaries and capabilities of AI so it can understand and learn our habits from a first-person perspective, via AR glasses and headsets. The hardware aspect, while critical, is secondary to the scope of what Facebook is trying to accomplish. At face value, this is about teaching AI to perceive and understand the world through your own eyes, to help you with a variety of everyday and/or occasional tasks. That's where some kind of head gear comes into play—it serves as the first-person conduit through which the AI is fed external data from your surroundings, experiences,... Read more...
Back in May, Samsung unveiled a high-tech gadget for data centers called the CXL Memory Expander, essentially a memory expansion device that utilizes the newer Compute Express Link interconnect standard. As with most hardware, however, the clever device is only as good as its software. To that end, Samsung has now introduced what it says is the industry's first open-source software solution specifically designed to support the CXL memory platform. The upshot of Samsung's CXL Memory Expander is that it allows clients to scale memory capacity and bandwidth in data centers to terabyte levels, going well beyond what was previously possible. CXL itself is an open, industry-supported... Read more...
With the proliferation of artificial intelligence in recent years, the term "neuromorphic" is being used much more often in the tech sector. If you're a native English speaker, you can probably surmise that neuromorphic means something along the lines of "brain-like." Indeed, the buzzword of the day is "neuromorphic processing," and it refers to computers—previously called "cognitive computers"—designed to mimic the function of the human brain. The reason it's the buzzword of the day is that Intel just announced its second-generation neuromorphic processor, Loihi 2. If you've never heard of the original Loihi, you probably aren't involved in bleeding-edge artificial intelligence... Read more...
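Loihi-class chips compute with networks of spiking neurons rather than the dense matrix math of conventional accelerators. The snippet below is a minimal leaky integrate-and-fire neuron in plain Python, with arbitrary illustrative constants that have nothing to do with Loihi 2's actual parameters, just to show the event-driven flavor of the model:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# toward zero, accumulates incoming spikes, and emits a spike of its own when
# it crosses a threshold. Constants here are arbitrary, for illustration only.
LEAK = 0.9        # fraction of potential retained each timestep
THRESHOLD = 1.0   # firing threshold
WEIGHT = 0.3      # synaptic weight applied to each input spike

def lif_run(input_spikes):
    potential, output = 0.0, []
    for spike in input_spikes:
        potential = potential * LEAK + WEIGHT * spike
        if potential >= THRESHOLD:
            output.append(1)     # fire...
            potential = 0.0      # ...and reset
        else:
            output.append(0)
    return output

# A sustained burst of input spikes eventually drives the neuron over threshold.
print(lif_run([1, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 1]))
```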
In 2014, it was established that the U.S. Copyright Office would "not register works produced by nature, animals, plants, or through divine or supernatural spirits." That declaration stemmed from a monkey that took a selfie and a photographer who thought they had the rights to the image. We are now facing a similar problem with artificial intelligence and patents across the pond in the United Kingdom, where a court has ruled that patents must list humans, not AI, as inventors. In 2015, Stephen Thaler filed a U.S. patent for something dubbed DABUS (Device for the Autonomous Bootstrapping of Unified Sentience), an "Electro-optical device and method for identifying and inducing topological states... Read more...
Drivers contracted to make deliveries for Amazon claim that special cameras installed in their delivery vans are unfairly dinging them for road violations that did not occur, resulting in the loss of extra income and other bonuses they might otherwise be eligible to receive. Amazon, meanwhile, says the cameras have led to fewer accidents and fewer instances of distracted driving. Amazon earlier this year partnered with Netradyne to install the camera systems in Amazon-branded delivery vans. Called "Driveri," these are more than simple cameras. They're quad-camera systems with four HD lenses: one that faces the road, another that faces the driver, and two side lenses that collectively... Read more...
It is well established that facial recognition based on machine learning is not perfect by any stretch of the imagination; therefore, using it for security purposes is likely a bad idea. Research from Ben-Gurion University of the Negev has now demonstrated the point, showing that both digital and physical makeup can trick facial recognition systems with a success rate of up to 98%. The researchers at Ben-Gurion University explained that facial recognition is widely used in subways, airports, and workplaces to automatically identify individuals. In this experiment, the ArcFace face recognition model was used with 20 blacklisted participants who would be flagged in a real-world facial... Read more...
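Systems built on models like ArcFace typically map each face to an embedding vector and declare a match when similarity to an enrolled identity crosses a threshold; adversarial makeup works by nudging the captured face's embedding until it falls below that threshold. Here is a minimal numpy sketch of the matching step, with made-up embeddings and an arbitrary threshold, purely to show the mechanism being attacked:

```python
import numpy as np

# Toy illustration of threshold-based face matching. Real systems like ArcFace
# produce ~512-dim embeddings from a deep network; here the "embeddings" are
# random vectors and the threshold is arbitrary, purely to show the mechanism
# the adversarial makeup attack tries to defeat.
rng = np.random.default_rng(0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

THRESHOLD = 0.7                       # similarity required to declare a match

enrolled = rng.normal(size=512)       # embedding stored for a blacklisted face
probe_clean = enrolled + 0.3 * rng.normal(size=512)    # same face, no makeup
probe_makeup = enrolled + 1.5 * rng.normal(size=512)   # embedding pushed away

print("no makeup  :", cosine(enrolled, probe_clean) >= THRESHOLD)   # flagged
print("with makeup:", cosine(enrolled, probe_makeup) >= THRESHOLD)  # evades
```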
NVIDIA CEO Jensen Huang is adding another honor to his portfolio this week, as he was named one of the world's most influential people by TIME Magazine. Of course, we in the computing industry know all about Huang and his company's contributions to AI, PC gaming, and autonomous vehicles (among other sectors), and TIME is now recognizing those achievements. Andrew Ng, the founder of DeepLearning.AI and Landing AI, penned TIME's piece on Huang, writing that "he has helped enable a revolution that allows phones to answer questions out loud, farms to spray weeds but not crops, doctors to predict the properties of new drugs—with more wonders to come." Ng adds, "With still emerging AI technologies creating... Read more...
In the current landscape, most people would be happy just to get their mitts on a single GPU. The US Department of Energy's Argonne National Laboratory is much more fortunate, having procured a whopping 2,240 NVIDIA A100 Tensor Core GPUs (based on Ampere) for its Polaris supercomputer to "supercharge research and discovery." As such, Polaris qualifies as the largest GPU-based supercomputer on the planet, with thousands of A100s working in tandem with hundreds of second-generation and, sometime later, third-generation AMD EPYC server processors. More precisely, there are 560 nodes spread out over 40 racks, with four A100 GPUs in each node. As currently constructed, each node features a 32-core/64-thread... Read more...
Do you think maybe the tech industry is hyper-focused on artificial intelligence (AI) technologies? It certainly is, and to some extent AI is practically everywhere these days, from servers and high-performance computing to autonomous vehicles and everyday consumer devices, and everything in, around, and between. So it's not really shocking that Samsung is touting its latest advancements in processing-in-memory (PIM) technology at the Hot Chips 33 conference. Some of what Samsung is discussing has to do with past announcements. For example, back in February Samsung introduced the industry's first high bandwidth memory PIM, Aquabolt-XL, with AI processing built into its HBM2 Aquabolt to bolster... Read more...
Artificial intelligence is a tricky business; as with anything in life, with great power comes great responsibility. On the one hand, AI can power autonomous vehicles or help usher in more secure computing platforms. On the other hand, it now appears it's possible to end up in jail due to questionable AI-based evidence. That is precisely what happened to 65-year-old Michael Williams, who was arrested last August after being accused of killing a young man in his neighborhood who had asked him for a ride during a night of community unrest following a reported police brutality incident. In 2018, the city of Chicago entered into a $33 million contract with ShotSpotter, a network of surveillance... Read more...
The human brain is enormously complex, and cracking the code of its intricacies in its entirety may never be accomplished. Still, there has been plenty of interesting research into brain activity. Most recently, a team of researchers says it has developed a deep learning framework that is able to decode sensory and behavioral variables from wide-band neural data. Put another way, they came up with an AI scheme that can interpret brain signals and predict behaviors. Whoa. Interpreting brain data is no easy task, and it often depends on manual operations, the researchers say. The landscape is rife with incomplete datasets. The task is further compounded by the fact that activity in any particular... Read more...
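The researchers' framework is a deep network, but the shape of the decoding problem can be sketched far more simply: summarize a window of recorded signal as a feature vector and fit a model that maps it to the behavioral variable. The snippet below uses a plain ridge-regression decoder on synthetic data, a deliberately simplified stand-in rather than the team's architecture:

```python
import numpy as np

# Synthetic stand-in for "wide-band neural data": 1,000 time windows, each
# summarized as a 64-channel feature vector, and a single behavioral variable
# (say, running speed) that is a noisy linear function of those features.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 64))                  # neural features per window
true_w = rng.normal(size=64)
y = X @ true_w + 0.5 * rng.normal(size=1000)     # behavioral variable

# Ridge-regression decoder: w = (X^T X + lambda * I)^-1 X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(64), X.T @ y)

pred = X @ w
corr = np.corrcoef(pred, y)[0, 1]
print(f"decoded vs. actual behavior, correlation: {corr:.3f}")
```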
Following reports that call center company Teleperformance allegedly forced employees to undergo AI camera surveillance, Amazon wants to monitor its own customer service employees. Soon, Amazon could use a system that captures all workers' keystrokes to run behavioral analysis and prevent malicious hackers or imposters from stealing data. In a confidential document obtained by Motherboard, Amazon reports that there have been several cases of customer data being accessed around the world. India ranks at the top of the list with 120 security incidents, followed by the Philippines with just under 70, and the U.S. with approximately 40. While the individual incidents are not explained,... Read more...
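One common way to do this kind of keystroke-based behavioral analysis (the document does not spell out Amazon's exact method) is keystroke dynamics: a person's typing rhythm forms a rough behavioral fingerprint, and a session whose rhythm deviates sharply from the account owner's baseline can be flagged as a possible impostor. A toy sketch of that idea, with invented timings and an invented threshold:

```python
import numpy as np

# Toy keystroke-dynamics check (not Amazon's actual system): compare the mean
# inter-key interval of a new session against an employee's baseline profile
# and flag sessions that deviate by more than a few standard deviations.
rng = np.random.default_rng(2)

baseline_sessions = rng.normal(loc=0.18, scale=0.02, size=50)  # seconds per keystroke
profile_mean, profile_std = baseline_sessions.mean(), baseline_sessions.std()

def looks_like_impostor(session_intervals, threshold=3.0):
    z = abs(np.mean(session_intervals) - profile_mean) / profile_std
    return z > threshold

print(looks_like_impostor(rng.normal(0.18, 0.02, size=200)))  # same typist: False
print(looks_like_impostor(rng.normal(0.09, 0.02, size=200)))  # much faster typist: True
```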
If you thought Amazon wanting drivers to submit to biometric surveillance was bad, these recent revelations take the cake. Colombia-based call center workers, who perform outsourced customer service for some of the largest companies in the US, are reportedly being pressured into signing a contract that allows their employer to install cameras in their homes to monitor work performance. Teleperformance is one of the world's largest call center companies, with nearly 400,000 employees and a client list that includes Apple, Amazon, and Uber, among others. Now, in a ground-breaking report from NBC, six of the company's Colombia-based workers, some of them working on contracts for those very companies, have come... Read more...
Though it arrived well past its originally planned end-of-2020 debut, the U.K.'s most powerful supercomputer is now operational. Built on NVIDIA hardware, the Cambridge-1 is a $100 million, 400 petaflop beast of a computer that ranks "among the world's top 50 fastest computers and is powered by 100 percent renewable energy." Under the hood, the Cambridge-1 supercomputer comprises 80 DGX A100 systems tied together, each featuring NVIDIA A100 GPUs, BlueField-2 DPUs, and NVIDIA HDR InfiniBand networking. The whole system is an NVIDIA DGX SuperPOD that delivers the aforementioned 400 petaflops of AI performance and eight petaflops of Linpack performance... Read more...
I have only dabbled in programming, and from my limited experience, I can appreciate that really good coding is an art form that not everyone has mastered. Whether I have it or not, I can't say—I've never dived deeply enough into programming to find out. If I ever do, Microsoft's new GitHub Copilot might prove to be a boon. What exactly is Copilot? It is an AI pair programmer that helps coders hammer out lines and functions faster and with less work, especially the more mundane stuff. Developed in collaboration with OpenAI, an AI research company co-founded by Elon Musk and backed by Microsoft (by way of a $1 billion investment in 2019), Copilot taps into the power of artificial intelligence... Read more...
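In practice, Copilot watches the file being edited and proposes completions from context such as a comment or a function signature. The example below is illustrative only: the docstring is something a developer might type, and the body is the kind of completion an AI pair programmer could plausibly suggest, not actual Copilot output:

```python
# What the developer types (the "prompt" the assistant sees):
#
#   def parse_duration(text: str) -> int:
#       """Convert a string like '1h 30m 15s' into a total number of seconds."""
#
# The sort of body an AI pair programmer might then suggest:

import re

def parse_duration(text: str) -> int:
    """Convert a string like '1h 30m 15s' into a total number of seconds."""
    units = {"h": 3600, "m": 60, "s": 1}
    total = 0
    for value, unit in re.findall(r"(\d+)\s*([hms])", text.lower()):
        total += int(value) * units[unit]
    return total

print(parse_duration("1h 30m 15s"))  # 5415
```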
NVIDIA DLSS is an AI rendering technology that boosts performance while still letting players enjoy high-fidelity visuals in games. Of late, this technology has been added to a myriad of games, such as Rainbow Six: Siege earlier this year. Now, the tech is slated to come to five new titles in the coming weeks. Announced today, NVIDIA DLSS will make its way to Facepunch Studio's multiplayer survival game Rust, arriving on July 1st. Rust's Project Lead, Helk, explained that this is important because, "in Rust split-second reactions can be the difference between life and death, with NVIDIA DLSS offering our players a performance boost, without sacrificing visual... Read more...
While there may be funny and impressive deepfakes out there, the technology poses a risk to trustworthy media and public figures. Companies are working to get ahead of the problem by developing robust deepfake detection tools, and Facebook is the latest to join the group with a method that reverse-engineers the generative model behind a single deepfake. As time goes on, figuring out whether an image is real or fake has become increasingly difficult as new deepfake generative models are created. Things could end poorly, for example, if the U.S. President were deepfaked into a video saying something defamatory about another country, which could spark hostilities or even war if the fake were not identified. To... Read more...
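The core idea in this line of work is that a generator leaves a subtle, consistent "fingerprint" in its outputs, which can be recovered from a single image and used to infer which model produced it. The snippet below is only a rough sketch of that fingerprint-matching intuition, using random arrays and a crude blur-based residual, and is not Facebook's actual pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Very rough sketch of fingerprint-based attribution (not Facebook's pipeline):
# a generator tends to leave a subtle, consistent high-frequency pattern in its
# outputs. Here the "fingerprint" is just the residual after blurring, and a
# probe image is attributed to whichever known model's fingerprint it matches
# best. All images are random arrays, purely to show the mechanics.
rng = np.random.default_rng(3)

def residual(img):
    return img - gaussian_filter(img, sigma=2)

# Pretend fingerprints for two known generative models.
known = {
    "model_a": rng.normal(size=(64, 64)) * 0.05,
    "model_b": rng.normal(size=(64, 64)) * 0.05,
}

# A "deepfake" produced by model A: smooth scene content plus model A's fingerprint.
scene = gaussian_filter(rng.normal(size=(64, 64)), sigma=4)
fake = scene + known["model_a"]

probe = residual(fake)
scores = {name: float(np.sum(probe * fp)) for name, fp in known.items()}
print(max(scores, key=scores.get))  # expected: model_a
```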
Battlefield 2042 has been announced and shown off through game engine and gameplay footage over the past week. More recently, EA held its first Battlefield Briefing, which gave a closer look at the upcoming game. In that briefing, which was recently updated, we found out that Battlefield 2042 will be introducing artificial intelligence soldiers to the series. One of the most frustrating things about an online game is wanting to play when the player base is not what it once was. Sometimes all you can find are empty servers, which can prove very disappointing. To combat this, Battlefield 2042 is bringing AI soldiers to the battlefield to fill those empty slots... Read more...
For years, we have known that AI can beat humans at games like Go, StarCraft, and up to 57 different Atari 2600 games. It is commonly believed, however, that AI cannot beat humans at designing things, but that is not necessarily the case. Google has decided to use AI to design microchips for the next generation of its Tensor Processing Units, and the AI can lay out a chip in six hours, whereas humans can take several months or longer. When it comes to designing computer chips, one of the more laborious yet highly important tasks is laying out the components in what is called a chip floorplan. Where all the physical parts go can have a massive impact, affecting power consumption, performance, and chip area.... Read more...
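Google's system frames floorplanning as a reinforcement learning problem with a learned policy network, which is well beyond a snippet. As a toy stand-in that only illustrates the underlying objective, minimizing the wirelength of a placement, the sketch below shuffles invented blocks around a small grid with simulated annealing:

```python
import math
import random

# Toy floorplanning stand-in (NOT Google's RL method): place 6 blocks on a
# 3x2 grid and use simulated annealing to minimize total Manhattan wirelength
# of the nets connecting them. Blocks, nets, and parameters are invented.
random.seed(0)

BLOCKS = ["cpu", "cache", "dma", "io", "npu", "mem"]
NETS = [("cpu", "cache"), ("cpu", "npu"), ("npu", "mem"), ("dma", "mem"), ("io", "dma")]
SLOTS = [(x, y) for x in range(3) for y in range(2)]   # grid positions

def wirelength(placement):
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

placement = dict(zip(BLOCKS, SLOTS))                   # arbitrary initial layout
cost, temp = wirelength(placement), 5.0

for step in range(5000):
    a, b = random.sample(BLOCKS, 2)                    # propose swapping two blocks
    placement[a], placement[b] = placement[b], placement[a]
    new_cost = wirelength(placement)
    if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
        cost = new_cost                                # accept the swap
    else:
        placement[a], placement[b] = placement[b], placement[a]  # revert it
    temp *= 0.999                                      # cool down

print("final wirelength:", cost, placement)
```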