IEEE Offers New Transportation Platform With Advanced Analytics Tools

June 5, 2024 at 20:00


To help find ways to solve transportation issues such as poorly maintained roads, traffic jams, and the high rate of accidents, researchers need access to the most current datasets on a variety of topics. But tracking down information about roadway conditions, congestion, and other statistics across multiple websites can be time-consuming. Plus, the data isn’t always accurate.

The new National Transportation Data & Analytics Solution (NTDAS), developed with the help of IEEE, makes it easier to retrieve, visualize, and analyze data in one place. NTDAS combines advanced research tools with access to high-quality transportation datasets from the U.S. Federal Highway Administration’s National Highway System and the entire Traffic Message Channel network, which distributes information on more than 1 million road segments. Anonymous data on millions of cars and trucks is generated from vehicle probes, which are vehicles equipped with GPS or global navigation satellite systems that gather traffic data on location, speed, and direction. This information helps transportation planners improve traffic flow, make transportation networks more efficient, and plan budgets.
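The probe-data pipeline described above can be sketched in miniature. The records and field names below are invented for illustration and are not the actual NTDAS or Traffic Message Channel schema:

```python
from collections import defaultdict

# Hypothetical anonymized probe records: (segment_id, speed_mph, heading_deg).
# Real TMC segment IDs and probe payloads differ; these are toy values.
probe_records = [
    ("TMC-101", 54.0, 90.0),
    ("TMC-101", 48.0, 88.0),
    ("TMC-202", 23.0, 270.0),
]

def mean_speed_by_segment(records):
    """Aggregate probe speeds into a mean speed per road segment."""
    sums = defaultdict(lambda: [0.0, 0])
    for segment, speed, _heading in records:
        sums[segment][0] += speed
        sums[segment][1] += 1
    return {seg: total / n for seg, (total, n) in sums.items()}

print(mean_speed_by_segment(probe_records))
# {'TMC-101': 51.0, 'TMC-202': 23.0}
```

A planner comparing these per-segment means against free-flow speeds would get a simple congestion measure of the kind the platform's tools automate.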

The platform is updated monthly and contains archival data back to 2017.

“The difference between NTDAS and other competitors is that our data comes from a trusted source that means the most: the U.S. Federal Highway Administration,” says Lavanya Sayam, senior manager of data analytics alliances and programs for IEEE Global Products and Marketing. “The data has been authenticated and validated. The ability to download this massive dataset provides an unparalleled ease to data scientists and machine-learning engineers to explore and innovate.”

IEEE is diversifying its line of products beyond its traditional fields of electrical engineering, Sayam adds. “We are not just focused on electrical or computer science,” she says. “IEEE is so diverse, and this state-of-the-art platform reflects that.”

Robust analytical tools

NTDAS was built in partnership with INRIX, a transportation analytics solutions provider, and the University of Maryland’s Center for Advanced Transportation Technology Laboratory, a leader in transportation science research. INRIX provided the data, while UMD built the analytics tools. The platform leverages the National Performance Management Research Data Set, a highly granular data source from the Federal Highway Administration.

The suite of tools allows users to do tasks such as creating a personal dashboard to monitor traffic conditions on specific roads, downloading raw data for analysis, building animated maps of road conditions, and measuring the flow of traffic. There are tutorials available on the platform on how to use each tool, and templates for creating reports, documents, and pamphlets.

“The difference between National Transportation Data & Analytics Solutions and other competitors is that our data comes from a trusted source that means the most: the U.S. Federal Highway Administration.” —Lavanya Sayam

“This is the first time this type of platform is being offered by IEEE to the global academic institutional audience,” she says. “IEEE is always looking for new ways to serve the engineering community.”

A subscription-based service, NTDAS has multidisciplinary relevance, Sayam says. Its use cases serve researchers and educators who need a robust platform that puts all the data they need for analytics in one place, she says. For university instructors, it offers an innovative way to teach courses; for students, it is a way to apply what they have learned to real-world data and uses.

The platform goes beyond just those working in transportation, Sayam notes. Others who might find NTDAS useful include those who study traffic as it relates to sustainability, the environment, civil engineering, public policy, business, and logistics, she adds.

50 ways to minimize the impact of traffic

NTDAS also includes more than 50 use cases created by IEEE experts to demonstrate how the data could be analyzed. The examples identify ways to protect the environment, better serve disadvantaged communities, support alternative transportation, and improve the safety of citizens. “Data from NTDAS can be easily extrapolated to non-U.S. geographies, making it highly relevant to global researchers,” according to Sayam. This is explained in specific use cases too.

The cases cover topics such as the impact of traffic on bird populations, air-quality issues in underserved communities, and optimal areas to install electric vehicle charging stations.

Two experts covered various strategies for how to use the data to analyze the impact of transportation and infrastructure on the environment in this on-demand webinar held in May.

Thomas Brennan, a professor of civil engineering at the College of New Jersey, discussed how using NTDAS data could aid in better planning of evacuation routes during wildfires, such as determining the location of first responders and traffic congestion in the area, including seasonal traffic. This and other data could lead to evacuating residents faster, new evacuation road signage, and better communication warning systems, he said.

“Traffic systems are super complex and very difficult to understand and model,” said presenter Jane MacFarlane, director of the Smart Cities and Sustainable Mobility Center at the University of California’s Institute of Transportation Studies, in Berkeley. “Now that we have datasets like these, that’s giving us a huge leg up in trying to use them for predictive modeling and also helping us with simulating things so that we can gain a better understanding.”

Watch this short demonstration about the National Transportation Data & Analytics Solutions platform.

“Transportation is a basic fabric of society,” Sayam says. “Understanding its impact is an imperative for better living. True to IEEE’s mission of advancing technology for humanity, NTDAS, with its interdisciplinary relevance, helps us understand the impact of transportation across several dimensions.”


AI For Data Management

May 30, 2024 at 09:03

Data management is becoming a significant new challenge for the chip industry, as well as a brand new opportunity, as the amount of data collected at every step of design through manufacturing continues to grow.

Exacerbating the problem is the rising complexity of designs, many of which are highly customized and domain-specific at the leading edge, as well as increasing demands for reliability and traceability. There also is a growing focus on chiplets developed using different processes, including some from different foundries, and new materials such as glass substrates and ruthenium interconnects. On the design side, EDA and verification tools can generate terabytes of data on a weekly or even a daily basis, unlike in the past when this was largely done on a per-project basis.

While more data can be used to provide insights into processes and enable better designs, it’s an ongoing challenge to manage the current volumes being generated. The entire industry must rethink some well-proven methodologies and processes, as well as invest in a variety of new tools and approaches. At the same time, these changes are generating concern in an industry used to proceeding cautiously, one step at a time, based on silicon- and field-proven strategies. Increasingly, AI/ML is being added into design tools to identify anomalies and patterns in large data sets, and many of those tools are being regularly updated as algorithms are updated and new features are added, making it difficult to know exactly when and where to invest, which data to focus on, and with whom to share it.

“Every company has its own design flow, and almost every company has its own methodology around harvesting that data, or best practices about what reports should or should not be written out at what point,” said Rob Knoth, product management director in Cadence’s Digital & Signoff group. “There’s a death by 1,000 cuts that can happen in terms of just generating titanic volumes of data because, in general, disk space is cheap. People don’t think about it a lot, and they’ll just keep generating reports. The problem is that just because you’re generating reports doesn’t mean you’re using them.”

Fig. 1: Rising design complexity is driving increased need for data management. Source: IEEE Rising Stars 2022/Cadence

As with any problem in chip design, there is opportunity in figuring out a path forward. “You can always just not use the data, and then you’re back where you started,” said Tony Chan Carusone, CTO at Alphawave Semi. “The reason it becomes a problem for organizations is because they haven’t architected things from the beginning to be scalable, and therefore, to be able to handle all this data. Now, there’s an opportunity to leverage data, and it’s a different way. So it’s disruptive because you have to tear things apart, from re-architecting systems and processes to how you collect and store data, and organize it in order to take advantage of the opportunity.”

Buckets of data, buckets of problems
The challenges that come with this influx of data can be divided into three buckets, said Jim Schultz, senior staff product manager at Synopsys. The first is figuring out what information is actually critical to keep. “If you make a run, designers tend to save that run because if they need to do a follow-up run, they have some data there, and they may go, ‘Okay, well, what’s the runtime? How long did that run take? My manager is going to ask me what I think the runtime is going to be on the next project or the next iteration of the block.’ While that data may not be necessary, designers and engineers have a tendency to hang onto it anyway, just in case.”

The second challenge is that once the data starts to pour in, it doesn’t stop, raising questions about how to manage collection. And third, once the data is collected, how can it be put to best use?

“Data analytics have been around, with other types of companies exploring different types of data analytics, but the difference is that those can be very generic solutions,” said Schultz. “What we need for our industry is going to be very specific data analytics. If I have a timing issue, I want you to help me pinpoint what the cause of that timing violation is. That’s very specific to what we do in EDA. When we talk about who is cutting through the noise, we don’t want data that’s just presented. We want the data that is what the designer most cares about.”

Data security
The sheer number of tools being used and companies and people involved along the design pathway raises another challenge — security.

“There’s a lot of thought and investment going into the security aspect of data, and just as much as the problem of what data to save and store is the type of security we have to have without hindering the user day-to-day,” said Simon Rance, director of product management at Keysight. “That’s becoming a bigger challenge. Things like the CHIPS Act and the geopolitical scenarios we have at the moment are compounding that problem because a lot of the companies that used to create all these devices by themselves are having to collaborate, even with companies in different regions of the globe.”

This requires a balancing act. “It’s almost like a recording studio where you have all these knobs and dials to fine tune it, to make sure we have security of the data,” said Rance. “But we’re also able to get the job done as smoothly and as easily as we can.”

Further complicating the security aspect is that designing chips is not a one-man job. As leading-edge chips become increasingly complex and heterogeneous, they can involve hundreds of people in multiple companies.

“An important thing to consider when you’re talking about big data and analytics is what you’re going to share and with whom you’re going to share it,” said Synopsys’ Schultz. “In particular, when you start bringing in and linking data from different sources, if you start bringing in data related to silicon performance, you don’t want everybody to have access to that data. So the whole security protocol is important.”

Even the mundane matters — having a ton of data makes it likely, at some point, that data will be moved.

“The more places the data has to be transferred to, the more delays,” said Rance. “The bigger the data set, the longer it takes to go from A to B. For example, a design team in the U.S. may be designing during the day. Then, another team in Singapore or Japan will pick up on that design in their time zone, but they’re across the world. So you’re going to have to sync the data back and forth between these kinds of design sites. The bigger the data, the harder to sync.”
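The cross-site sync problem Rance describes is commonly tackled with chunk-level hashing, so that only the pieces that changed cross the wire. A minimal sketch, with toy chunk size and data rather than any vendor's actual protocol:

```python
import hashlib

CHUNK = 4  # tiny chunk size for illustration; real tools use KB/MB chunks

def chunk_hashes(data: bytes):
    """SHA-256 digest of each fixed-size chunk of the file."""
    return [hashlib.sha256(data[i:i + CHUNK]).hexdigest()
            for i in range(0, len(data), CHUNK)]

def changed_chunks(local: bytes, remote_hashes):
    """Indices of local chunks that differ from the remote copy,
    i.e. the only chunks that need to be transferred."""
    local_hashes = chunk_hashes(local)
    return [i for i, h in enumerate(local_hashes)
            if i >= len(remote_hashes) or h != remote_hashes[i]]

v1 = b"ABCDEFGHIJKL"   # yesterday's copy at the remote site
v2 = b"ABCDXXGHIJKL"   # today's edit touches only chunk 1
print(changed_chunks(v2, chunk_hashes(v1)))  # [1]
```

The bigger the unchanged fraction of a design database, the more this approach shrinks the transfer between time zones.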

Solutions
The first step toward solving the issue of too much data is figuring out what data is actually needed. Rance said his team has found success using smart algorithms that help figure out which data is essential, which in turn can help optimize storage and transfer times.

There are less technical problems that can rear their heads as well. Gina Jacobs, head of global communications and brand marketing at Arteris, said that engineers who use a set methodology, particularly those who are used to working on a problem by themselves and “brute forcing” a solution, also can find themselves overwhelmed by data.

“Engineers and designers can also switch jobs, taking with them institutional knowledge,” Jacobs said. “But all three problems can be solved with a single solution — having data stored in a standardized way that is easily accessible and sortable. It’s about taking data and requirements and specifications in different forms and then having it in the one place so that the different teams have access to it, and then being able to make changes so there is a single source of truth.”

Here, EDA design and data management tools are increasingly relying on artificial intelligence to help. Schultz forecasted a future where generative AI will touch every facet of chip development. “Along with that is the advanced data analytics that is able to mine all of that data you’ve been collecting, instead of going beyond the simple things that people have been doing, like predicting how long runtime is going to be or getting an idea what the performance is going to be,” he said. “Tools are going to be able to deal with all of that data and recognize trends much faster.”

Still, those all-encompassing AI tools, capable of complex analysis, are still years away. Cadence’s Knoth said he’s already encountered clients that are reluctant to bring it into the mix due to fears over the costs involved in disk space, compute resources, and licenses. Others, however, have been a bit more open-minded.

“Initially, AI can use a lot of processors to generate a lot of data because it’s doing a lot of things in parallel when it’s doing the inferencing, but it usually gets to the result faster and more predictably,” he said. So while a machine learning algorithm may generate even more vast amounts of data, on top of the piles currently available, “a good machine learning algorithm could be watching and smartly killing or restarting jobs where needed.”
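One rough sketch of how such a watchdog might decide which jobs to kill or restart, not any vendor's actual algorithm, is to flag running jobs whose elapsed time is a statistical outlier against historical runtimes:

```python
import statistics

def flag_stragglers(history_hours, running_hours, k=3.0):
    """Flag running jobs whose elapsed time exceeds
    mean + k * stdev of past runtimes -- candidates to kill/restart."""
    mean = statistics.mean(history_hours)
    stdev = statistics.stdev(history_hours)
    cutoff = mean + k * stdev
    return [job for job, hours in running_hours.items() if hours > cutoff]

history = [2.0, 2.2, 1.9, 2.1, 2.3]          # past runtimes (hours)
running = {"job-17": 2.4, "job-18": 9.5}     # jobs currently executing
print(flag_stragglers(history, running))     # ['job-18']
```

A production system would condition the statistics on design size and tool version, but the principle, watch the distribution and intervene on outliers, is the same.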

As for the humans who are still an essential component to chip design, Alphawave’s Carusone said hardware engineers should take a page from lessons learned years ago from their counterparts in the software development world.

These include:

  • Having an organized and automated way to collect data, file it in a repository, and not do anything manually;
  • Developing ways to run verification and lab testing and everything in between in parallel, but with the data organized in a way that can be mined; and
  • Creating methods for rigorously checking in and out of different test cases that you want to consider.
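The first bullet, automated check-in of data together with the conditions it was collected under, can be sketched as a content-addressed store. The field names and payload below are hypothetical:

```python
import hashlib

def check_in(repo: dict, payload: bytes, conditions: dict):
    """File a measurement in the repo keyed by its content hash,
    alongside the test conditions it was collected under."""
    key = hashlib.sha256(payload).hexdigest()[:12]
    repo[key] = {"data": payload, "meta": conditions}
    return key

repo = {}
key = check_in(repo, b"eye-diagram-raw-bytes",
               {"temp_c": 85, "voltage_v": 0.9, "firmware": "rc2"})
print(repo[key]["meta"]["temp_c"])  # 85
```

With every artifact carrying its conditions, the "what was this collected under?" question Carusone raises below becomes a lookup rather than an archaeology project.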

“The big thing is you’ve got all this data collected, but then what is each of those files, each of those collections of data?” said Carusone. “What does that correspond to? What test conditions was that collected in? The software community dealt with that a while ago, and the hardware community also needs to have this under its belt, taking it to the next level and recognizing we really need to be able to do this en masse. We need to be able to have dozens of people work in parallel, collecting data and have it all on there. We can test a big collection of our designs in the lab without anyone having to touch a thing, and then also try refinements of the firmware, scale them out, then have all the data come in and be analyzed. Being able to have all that done in an automated way lets you track down and fix problems a lot more quickly.”

Conclusion
The influx of new tools used to analyze and test chip designs has increased productivity, but those designs come with additional considerations. Institutions and individual engineers and designers have never had access to so much data, but that data is of limited value if it’s not used effectively.

Strategies to properly store and order that data are essential. Some powerful tools are already in place to help do that, and the AI revolution promises to make even more powerful resources available to quickly cut down on the time needed to run tests and analyze the results.

For now, handling all that data remains a tricky balance, according to Cadence’s Knoth. “If this was an easy problem, it wouldn’t be a problem. Being able to communicate effectively, hierarchically — not just from a people management perspective, but also hierarchically from a chip and project management perspective — is difficult. The teams that do this well invest resources into that process, specifically the communication of top-down tightening of budgets or top-down floorplan constraints. These are important to think about because every engineer is looking at chip-level timing reports, but the problem that they’re trying to solve might not ever be visible. But if they have a report that says, ‘Here is your view of what your problems are to solve,’ you can make some very effective work.”

Further Reading
EDA Pushes Deeper Into AI
AI is both evolutionary and revolutionary, making it difficult to assess where and how it will be used, and what problems may crop up.
Optimizing EDA Cloud Hardware And Workloads
Algorithms written for GPUs can slice simulation time from weeks to hours, but not everything is optimized or benefits equally.

The post AI For Data Management appeared first on Semiconductor Engineering.

Communities Should Reject Surveillance Products Whose Makers Won't Allow Them to be Independently Evaluated

March 6, 2024 at 16:05
American communities are being confronted by a lot of new police technology these days, much of which involves surveillance or otherwise raises the question: “Are we as a community comfortable with our police deploying this new technology?” A critical question when addressing such concerns is: “Does it even work, and if so, how well?” It’s hard for communities, their political leaders, and their police departments to know what to buy if they don’t know what works and to what degree.

One thing I’ve learned from following new law enforcement technology for over 20 years is that there is an awful lot of snake oil out there. When a new capability arrives on the scene, whether it’s face recognition, emotion recognition, video analytics, or “big data” pattern analysis, some companies will rush to promote the technology long before it is good enough for deployment, which sometimes never happens. That may be even more true today in the age of artificial intelligence. “AI” is a term that often amounts to no more than trendy marketing jargon.

Related: Six Questions to Ask Before Accepting a Surveillance Technology. Community members, policymakers, and political leaders can make better decisions about new technology by asking these questions. (Source: American Civil Liberties Union)

Given all this, communities and city councils should not adopt new technology that has not been subject to testing and evaluation by an independent, disinterested party. That’s true for all types of technology, but doubly so for technologies that have the potential to change the balance of power between the government and the governed, like surveillance equipment. After all, there’s no reason to get wrapped up in big debates about privacy, security, and government power if the tech doesn’t even work.

One example of a company refusing to allow independent review of its product is the license plate recognition company Flock, which is pushing those surveillance devices into many American communities and tying them into a centralized national network. (We wrote more about this company in a 2022 white paper.) Flock has steadfastly refused to allow the independent security technology reporting and testing outlet IPVM to obtain one of its license plate readers for testing, though IPVM has tested all of Flock’s major competitors. That doesn’t stop Flock from boasting that “Flock Safety technology is best-in-class, consistently performing above other vendors.” Claims like these are puzzling and laughable when the company doesn’t appear to have enough confidence in its product to let IPVM test it.

Related: Experts Say ‘Emotion Recognition’ Lacks Scientific Foundation. (Source: American Civil Liberties Union)

Communities considering installing Flock cameras should take note. That is especially the case when errors by Flock and other companies’ license plate readers can lead to innocent drivers finding themselves with their hands behind their heads, facing jittery police pointing guns at them. Such errors can also expose police departments and cities to lawsuits.

Even worse is when a company pretends that its product has been subject to independent review when it hasn’t. The metal detector company Evolv, which sells (wait for it) AI metal detectors, submitted its technology to testing by a supposedly independent lab operated by the University of Southern Mississippi, and publicly touted the results of the tests. But IPVM and the BBC reported that the lab, the National Center for Spectator Sports Safety and Security (NCS4), had colluded with Evolv to manipulate the report and hide negative findings about the effectiveness of the company’s product. Like Flock, Evolv refuses to allow IPVM to obtain one of its units for testing. (We have written about Evolv and its product as well.)

One of the reasons these companies can prevent a tough, independent reviewer such as IPVM from obtaining their equipment is their subscription and/or cloud-based architecture. “Most companies in the industry still operate on the more traditional model of having open systems,” IPVM Government Research Director Conor Healy told me. “But there’s a rise in demand for cloud-based surveillance, where people can store things in the cloud, access them on their phone, see the cameras. Cloud-based surveillance by definition involves central control by the company that’s providing the cloud services.” Cloud-based architectures can worsen the privacy risks created by a surveillance system. Another consequence of their centralized control is increasing the ability of a company to control who can carry out an independent review.

We’re living in an era where a lot of new technology is emerging, with many companies trying to be the first to put it on the market. As Healy told me, “We see a lot of claims of AI, all the time. At this point, almost every product I see out there that gets launched has some component of AI.” But like other technologies before them, these products often come in highly immature, premature, inaccurate, or outright deceptive forms, relying on little more than the use of “AI” as a buzzword.

It’s vital for independent reviewers to contribute to our ongoing local and national conversations about new surveillance and other police technologies. It’s unclear why a company that has faith in its product would attempt to block independent review, which is all the more reason why buyers should know this about those companies.