FreshRSS

  • ✇AnandTech

MediaTek to Add NVIDIA G-Sync Support to Monitor Scalers, Make G-Sync Displays More Accessible

20 August 2024 at 23:30

NVIDIA on Tuesday said that future monitor scalers from MediaTek will support its G-Sync technologies. NVIDIA is partnering with MediaTek to integrate its full range of G-Sync technologies into future monitors without requiring a standalone G-Sync module, which makes advanced gaming features more accessible across a broader range of displays.

Traditionally, G-Sync technology relied on a dedicated G-Sync module – based on an Altera FPGA – to handle syncing display refresh rates with the GPU in order to reduce screen tearing, stutter, and input lag. As a more basic solution, in 2019 NVIDIA introduced G-Sync Compatible certification and branding, which leveraged the industry-standard VESA AdaptiveSync technology to handle variable refresh rates. In lieu of using a dedicated module, leveraging AdaptiveSync allowed for cheaper monitors, with NVIDIA's program serving as a stamp of approval that the monitor worked with NVIDIA GPUs and met NVIDIA's performance requirements. Still, G-Sync Compatible monitors lack some features that, to date, require the dedicated G-Sync module.

Through this new partnership, MediaTek will bring support for all of NVIDIA's G-Sync technologies, including the latest G-Sync Pulsar, directly into its scalers. G-Sync Pulsar enhances motion clarity and reduces ghosting, providing a smoother gaming experience. In addition to variable refresh rates and Pulsar, MediaTek-based G-Sync displays will support such features as variable overdrive, 12-bit color, Ultra Low Motion Blur, low latency HDR, and Reflex Analyzer. This integration will allow more monitors to support a full range of G-Sync features without having to incorporate an expensive FPGA.

The first monitors to feature full G-Sync support without needing an NVIDIA module include the AOC Agon Pro AG276QSG2, Acer Predator XB273U F5, and ASUS ROG Swift 360Hz PG27AQNR. These monitors offer 360Hz refresh rates, 1440p resolution, and HDR support.

What remains to be seen is which specific MediaTek scalers will support NVIDIA's G-Sync technology – or whether the company is going to implement support in all of its scalers going forward. It also remains to be seen whether monitors with NVIDIA's dedicated G-Sync modules retain any advantages over displays with MediaTek's scalers.

  • ✇XDA

Do Nvidia graphics cards work with AMD FreeSync?

21 August 2024 at 00:30

The best graphics cards these days have much more to offer than just pure performance. Features like ray tracing and DLSS have taken the PC gaming scene by storm, but it's the simple stuff like adaptive sync that's more important. Screen tearing is a major issue when your monitor's refresh rate and in-game frame rate are not synchronized. Fortunately, many of the best gaming monitors use AMD's FreeSync or Nvidia's G-Sync to combat this problem. While FreeSync and G-Sync will work with both AMD and Nvidia GPUs, there are a couple of caveats.

Nvidia's new partnership with MediaTek has just killed the module which made G-Sync monitors so damned expensive

20 August 2024 at 15:10

Nvidia has announced it has partnered with MediaTek to produce a scaler chip for gaming monitors that has the full G-Sync feature set built in. Rather than having to use a separate G-Sync module, display vendors can now use this single chip to bring Nvidia's variable refresh rate system to more products.

In 2013, Nvidia launched G-Sync, a system that allows monitors to vary the refresh rate so that when the GPU has finished rendering a frame, the display can show it immediately instead of having to wait for the next scheduled refresh. Without that synchronization, there's a risk of the frame buffer being swapped mid-refresh, resulting in a 'tear' across the screen.

VRR also greatly reduces any stuttering induced by differences between the display's refresh rate and a game's frame rate.
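To make the timing concrete, here's a minimal Python sketch (an illustration of the general idea, not Nvidia's implementation) comparing when finished frames actually appear under a fixed 60 Hz refresh versus VRR. The frame completion times are made-up numbers.

    # Fixed refresh: a finished frame waits for the next tick; VRR shows it at once.
    def present_times(frame_done_ms, refresh_ms=None):
        shown, next_refresh = [], 0.0
        for done in frame_done_ms:
            if refresh_ms is None:              # VRR: the panel refreshes on demand
                shown.append(done)
            else:                               # fixed rate: wait for the next tick
                while next_refresh < done:
                    next_refresh += refresh_ms
                shown.append(next_refresh)
        return shown

    frames = [14.0, 31.0, 45.5, 70.0]           # GPU finishes frames at these ms marks
    print(present_times(frames, refresh_ms=1000 / 60))  # ~[16.7, 33.3, 50.0, 83.3]
    print(present_times(frames))                        # [14.0, 31.0, 45.5, 70.0]

The fixed-rate column shows the variable wait (up to a whole refresh interval) that is perceived as stutter; swapping the frame without waiting on a fixed-rate display is what produces the tear instead.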

G-Sync isn't the only variable refresh rate (VRR) technology though, as DisplayPort (1.2a or newer) and HDMI 2.1 both feature it. AMD also has a VRR system called FreeSync, which is based on the DisplayPort version, though it has been substantially improved since it first appeared in 2015.

FreeSync doesn't require any additional hardware inside the monitor; the display just has to be able to adjust its refresh rate over a given range (e.g. between 30 Hz and 144 Hz). However, if a monitor manufacturer wants to offer full G-Sync support, it needs to purchase and fit a separate add-in board, with Nvidia's G-Sync chip and a little bit of RAM.

That adds to the bill of materials and since FreeSync is also royalty-free, vendors such as Asus, Acer, Gigabyte, MSI, et al have preferred to go with AMD's system (especially since it works with AMD, Intel, and Nvidia GPUs).

Hence Nvidia has teamed up with MediaTek to produce a scaler chip with the full G-Sync feature set built into it, including the latest Pulsar technology—a system that reduces motion blur, keeping small details as clear as possible, even when whipping the camera about in a game.

Three vendors—Acer, AOC, and Asus—have already announced some gaming monitors that will use the chip. They're all 27-inch, 1440p gaming monitors with a maximum refresh rate of 360 Hz.

Nvidia presentation images for the Nvidia-MediaTek G-Sync collaboration (Image credit: Nvidia)

There's no word on how expensive the Predator XB273U F5, Agon Pro AG276QS2, and ROG Swift PG27AQNR will be, or when they will be available to buy, though I should imagine that an announcement will be made soon enough.

The more important question to ask, regardless of the price tag, is "Why should I buy a G-Sync monitor instead of a FreeSync one?" On paper, there's little to separate the two technologies and it comes down to the individual application of each one in a gaming monitor.

A display that supports G-Sync Ultimate will meet a certain level of hardware capabilities, whereas you're not necessarily guaranteed that with a standard FreeSync one. AMD does have FreeSync Premium, with higher specifications and more features than the original FreeSync, though.


While you might think that having a GeForce RTX graphics card requires you to be using a G-Sync gaming monitor, the reality is far from being that specific. Nvidia has a list of monitors that it certifies as having G-Sync Ultimate or standard G-Sync features, or just being G-Sync Compatible (i.e. it's a FreeSync monitor but it will work with Nvidia GPUs).

That said, G-Sync Ultimate monitors do have a very wide VRR range, typically from 1 Hz up to the monitor's maximum refresh rate, whereas G-Sync Compatible displays have a narrower range, e.g. 48 Hz to 144 Hz. With those monitors, if a game's frame rate drops below the lowest value in the VRR range, it might activate blur reduction, LFC (Low Framerate Compensation), or simply double the refresh rate to keep things in the VRR range.
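The refresh-doubling trick is simple enough to show in a few lines. Here's a hedged sketch of the LFC idea (the multiplier search is illustrative; real driver heuristics are more involved), assuming the 48 Hz to 144 Hz G-Sync Compatible range mentioned above:

    def lfc_multiplier(fps, vrr_min=48, vrr_max=144):
        if fps >= vrr_min:
            return 1                        # already inside the VRR window
        for k in range(2, 10):              # show each frame k times
            if vrr_min <= fps * k <= vrr_max:
                return k
        return None                         # below anything the range can absorb

    print(lfc_multiplier(30))   # 2 -> panel runs at 60 Hz, inside 48-144
    print(lfc_multiplier(20))   # 3 -> 60 Hz
    print(lfc_multiplier(100))  # 1 -> no compensation needed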

With these new MediaTek G-Sync equipped monitors, you should get the complete G-Sync Ultimate features without the big cost of having to pay for the additional G-Sync module. How all of this pans out in the real world… well, we'll let you know when we get one in for review!


  • ✇Semiconductor Engineering

AI/ML’s Role In Design And Test Expands

5 August 2024 at 09:03

The role of AI and ML in test keeps growing, providing significant time and money savings that often exceed initial expectations. But it doesn’t work in all cases, sometimes even disrupting well-tested process flows with questionable return on investment.

One of the big attractions of AI is its ability to apply analytics to large data sets that are otherwise limited by human capabilities. In the critical design-to-test realm, AI can address problems such as tool incompatibilities between the design set-up, simulation, and ATE test program, which typically slow debugging and development efforts. Some of the most time-consuming and costly aspects of design-to-test arise from incompatibilities between tools.

“During device bring-up and debug, complex software/hardware interactions can expose the need for domain knowledge from multiple teams or stakeholders, who may not be familiar with each other’s tools,” said Richard Fanning, lead software engineer at Teradyne. “Any time spent doing conversions or debugging differences in these set-ups is time wasted. Our toolset targets this exact problem by allowing all set-ups to use the same set of source files so everyone can be sure they are running the same thing.”

ML/AI can help keep design teams on track, as well. “As we drive down this technology curve, the analytics and the compute infrastructure that we have to bring to bear becomes increasingly more complex and you want to be able to make the right decision with a minimal amount of overkill,” said Ken Butler, senior director of business development in the ACS data analytics platform group at Advantest. “In some cases, we are customizing the test solution on a die-by-die type of basis.”

But despite the hype, not all tools work well in every circumstance. “AI has some great capabilities, but it’s really just a tool,” said Ron Press, senior director of technology enablement at Siemens Digital Industries Software, in a recent presentation at a MEPTEC event. “We still need engineering innovation. So sometimes people write about how AI is going to take away everybody’s job. I don’t see that at all. We have more complex designs and scaling in our designs. We need to get the same work done even faster by using AI as a tool to get us there.”

Speeding design to characterization to first silicon
In the face of ever-shrinking process windows and the lowest allowable defectivity rates, chipmakers continually are improving the design-to-test processes to ensure maximum efficiency during device bring-up and into high volume manufacturing. “Analytics in test operations is not a new thing. This industry has a history of analyzing test data and making product decisions for more than 30 years,” said Advantest’s Butler. “What is different now is that we’re moving to increasingly smaller geometries, advanced packaging technologies and chiplet-based designs. And that’s driving us to change the nature of the type of analytics that we do, both in terms of the software and the hardware infrastructure. But from a production test viewpoint, we’re still kind of in the early days of our journey with AI and test.”

Nonetheless, early adopters are building out the infrastructure needed for in-line compute and AI/ML modeling to support real-time inferencing in test cells. And because no one company has all the expertise needed in-house, partnerships and libraries of applications are being developed with tool-to-tool compatibility in mind.

“Protocol libraries provide out-of-the-box solutions for communicating common protocols. This reduces the development and debug effort for device communication,” said Teradyne’s Fanning. “We have seen situations where a test engineer has been tasked with talking to a new protocol interface, and saved significant time using this feature.”

In fact, data compatibility is a consistent theme, from design all the way through to the latest developments in ATE hardware and software. “Using the same test sequences between characterization and production has become key as the device complexity has increased exponentially,” explained Teradyne’s Fanning. “Partnerships with EDA tool and IP vendors is also key. We have worked extensively with industry leaders to ensure that the libraries and test files they output are formats our system can utilize directly. These tools also have device knowledge that our toolset does not. This is why the remote connect feature is key, because our partners can provide context-specific tools that are powerful during production debug. Being able to use these tools real-time without having to reproduce a setup or use case in a different environment has been a game changer.”

Serial scan test
But if it seems as if all the configuration changes are happening on the test side, it’s important to take stock of substantial changes in the approach to multi-core design for test.

Tradeoffs during the iterative process of design for test (DFT) have become so substantial in the case of multi-core products that a new approach has become necessary.

“If we look at the way a design is typically put together today, you have multiple cores that are going to be produced at different times,” said Siemens’ Press. “You need to have an idea of how many I/O pins you need to get your scan channels, the deep serial memory from the tester that’s going to be feeding through your I/O pins to this core. So I have a bunch of variables I need to trade off. I have the number of pins going to the core, the pattern size, and the complexity of the core. Then I’ll try to figure out what’s the best combination of cores to test together in what is called hierarchical DFT. But as these designs get more complex, with upwards of 2,500 cores, that’s a lot of tradeoffs to figure out.”

Press noted that applying AI with the same architecture can provide 20% to 30% higher efficiency, but an improved methodology based on packetized scan test (see figure 1) actually makes more sense.


Fig. 1: Advantages to the serial scan network (SSN) approach. Source: Siemens

“Instead of having tester channels feeding into the scan channels that go to each core, you have a packetized bus and packets of data that feed through all the cores. Then you instruct the cores when their packet information is going to be available. By doing this, you don’t have as many variables you need to trade off,” he said. At the core level, each core can be optimized for any number of scan channels and patterns, and the I/O pin count is no longer a variable in the calculation. “Then, when you put it into this final chip, it delivers from the packets the amount of data you need for that core, and that can work with any size serial bus, in what is called a serial scan network (SSN).”

Some of the results reported by Siemens EDA customers (see figure 2) highlight both supervised and unsupervised machine learning implementation for improvements in diagnosis resolution and failure analysis. DFT productivity was boosted by 5 to 10X using the serial scan network methodology.


Fig. 2: Realized benefits using machine learning and the serial scan network approach. Source: Siemens

What slows down AI implementation in HVM?
In the transition from design to testing of a device, the application of machine learning algorithms can enable a number of advantages, from better pairing of chiplet performance for use in an advanced package to test time reduction. For example, only a subset of high-performing devices may require burn-in.

“You can identify scratches on wafers, and then bin out the dies surrounding those scratches automatically within wafer sort,” said Michael Schuldenfrei, fellow at NI/Emerson Test & Measurement. “So AI and ML all sounds like a really great idea, and there are many applications where it makes sense to use AI. The big question is, why isn’t it really happening frequently and at-scale? The answer to that goes into the complexity of building and deploying these solutions.”

Schuldenfrei summarized four key steps in ML’s lifecycle, each with its own challenges. In the first phase, the training, engineering teams use data to understand a particular issue and then build a model that can be used to predict an outcome associated with that issue. Once the model is validated and the team wants to deploy it in the production environment, it needs to be integrated with the existing equipment, such as a tester or manufacturing execution system (MES). Models also mature and evolve over time, requiring frequent validation of the data going into the model and checking to see that the model is functioning as expected. Models also must adapt, requiring redeployment, learning, acting, validating and adapting, in a continuous circle.
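As a schematic only (the functions and the drift test below are placeholders, not any vendor's API), the circle Schuldenfrei describes can be reduced to a few runnable lines:

    import statistics

    def train(readings):                     # 1. train: build the model
        return {"mu": statistics.mean(readings),
                "sigma": statistics.stdev(readings)}

    def predict_pass(model, value, k=3.0):   # 2. deploy: decision on the tester/MES
        return abs(value - model["mu"]) <= k * model["sigma"]

    def drifted(model, recent, tol=0.5):     # 3. validate: check the incoming data
        return abs(statistics.mean(recent) - model["mu"]) > tol * model["sigma"]

    model = train([1.00, 1.02, 0.98, 1.01, 0.99])   # historical parametric data
    batch = [1.20, 1.22, 1.19, 1.21]                # the process has shifted
    if drifted(model, batch):
        model = train(batch)                        # 4. adapt: retrain and redeploy
    print(predict_pass(model, 1.21))                # True once the model adapts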

“That eats up a lot of time for the data scientists who are charged with deploying all these new AI-based solutions in their organizations. Time is also wasted in the beginning when they are trying to access the right data, organizing it, connecting it all together, making sense of it, and extracting features from it that actually make sense,” said Schuldenfrei.

Further difficulties are introduced in a distributed semiconductor manufacturing environment in which many different test houses are situated in various locations around the globe. “By the time you finish implementing the ML solution, your model is stale and your product is probably no longer bleeding edge so it has lost its actionability, when the model needs to make a decision that actually impacts either the binning or the processing of that particular device,” said Schuldenfrei. “So actually deploying ML-based solutions in a production environment with high-volume semiconductor test is very far from trivial.”

He cited a 2015 Google paper that stated how the ML code development part of the process is both the smallest and easiest part of the whole exercise, [1] whereas the various aspects of building infrastructure, data collection, feature extraction, data verification, and managing model deployments are the most challenging parts.

Changes from design through test ripple through the ecosystem. “People who work in EDA put lots of effort into design rule checking (DRC), meaning we’re checking that the work we’ve done and the design structure are safe to move forward because we didn’t mess anything up in the process,” said Siemens’ Press. “That’s really important with AI — what we call verifiability. If we have some type of AI running and giving us a result, we have to make sure that result is safe. This really affects the people doing the design, the DFT group and the people in test engineering that have to take these patterns and apply them.”

There are a multitude of ML-based applications for improving test operations. Advantest’s Butler highlighted some of the apps customers are pursuing most often, including search time reduction, shift left testing, test time reduction, and chiplet pairing (see figure 3).

“For minimum voltage, maximum frequency, or trim tests, you tend to set a lower limit and an upper limit for your search, and then you’re going to search across there in order to be able to find your minimum voltage for this particular device,” he said. “Those limits are set based on process split, and they may be fairly wide. But if you have analytics that you can bring to bear, then the AI- or ML-type techniques can basically tell you where this die lies on the process spectrum. Perhaps it was fed forward from an earlier insertion, and perhaps you combine it with what you’re doing at the current insertion. That kind of inference can help you narrow the search limits and speed up that test. A lot of people are very interested in this application, and some folks are doing it in production to reduce search time for test time-intensive tests.”


Fig. 3: Opportunities for real-time and/or post-test improvements to pair or bin devices, improve yield, throughput, reliability or cost using the ACS platform. Source: Advantest
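A hedged sketch of the search-narrowing idea Butler describes above: binary-search the minimum passing voltage, with the window either set wide from process split or tightened by a feed-forward prediction. The device_passes_at callback and every number here are stand-ins for the real tester interface.

    def find_vmin(device_passes_at, lo, hi, step=0.005):
        evaluations = 0
        while hi - lo > step:
            mid = (lo + hi) / 2
            evaluations += 1
            if device_passes_at(mid):
                hi = mid              # passes: Vmin is at or below mid
            else:
                lo = mid              # fails: Vmin is above mid
        return hi, evaluations

    passes = lambda v: v >= 0.712     # pretend the true Vmin is 0.712 V

    print(find_vmin(passes, 0.60, 0.90))  # wide limits:  (0.7125, 6 evaluations)
    print(find_vmin(passes, 0.69, 0.75))  # ML-narrowed: (0.7125, 4 evaluations)

Each evaluation is a real test execution, so trimming the search window trims test time directly.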

“The idea behind shift left is perhaps I have a very expensive test insertion downstream or a high package cost,” Butler said. “If my yield is not where I want it to be, then I can use analytics at earlier insertions to be able to try to predict which devices are likely to fail at the later insertion by doing analysis at an earlier insertion, and then downgrade or scrap those die in order to optimize downstream test insertions, raising the yield and lowering overall cost. Test time reduction is very simply the addition or removal of test content, skipping tests to reduce cost. Or you might want to add test content for yield improvement,” said Butler.

“If I have a multi-tiered device, and it’s not going to pass bin 1 criteria – but maybe it’s bin 2 if I add some additional content — then people may be looking at analytics to try to make those decisions. Finally, two things go together in my mind, this idea of chiplet designs and smart pairing. So the classic example is a processor die with a stack of high bandwidth memory on top of it. Perhaps I’m interested in high performance in some applications and low power in others. I want to be able to match the content and classify die as they’re coming through the test operation, and then downstream do pick-and-place and put them together in such a way that I maximize the yield for multiple streams of data. Similar kinds of things apply for achieving a low power footprint and carbon footprint.”
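A toy illustration of that pairing step (the data and the rank-for-rank heuristic are assumptions for illustration, not Advantest's algorithm): classify both populations by a measured figure of merit, then match like with like so a fast logic die isn't wasted on a slow memory stack.

    logic_die  = [("L1", 3.8), ("L2", 3.2), ("L3", 3.6)]   # (id, max GHz)
    hbm_stacks = [("H1", 6.4), ("H2", 7.2), ("H3", 6.8)]   # (id, Gbps/pin)

    # Sort both populations by performance and pair rank-for-rank.
    pairs = zip(sorted(logic_die, key=lambda d: d[1]),
                sorted(hbm_stacks, key=lambda d: d[1]))
    for (die, ghz), (hbm, gbps) in pairs:
        print(f"{die} ({ghz} GHz) + {hbm} ({gbps} Gbps/pin)")
    # L2 (3.2 GHz) + H1 (6.4 Gbps/pin)
    # L3 (3.6 GHz) + H3 (6.8 Gbps/pin)
    # L1 (3.8 GHz) + H2 (7.2 Gbps/pin)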

Generative AI
The question that inevitably comes up when discussing the role of AI in semiconductors is whether or not large language models like ChatGPT can prove useful to engineers working in fabs. Early work shows some promise.

“For example, you can ask the system to build an outlier detection model for you that looks for parts that are five sigma away from the center line, saying ‘Please create the script for me,’ and the system will create the script. These are the kinds of automated, generative AI-based solutions that we’re already playing with,” said Schuldenfrei. “But from everything I’ve seen so far, there is still quite a lot of work to be done to get these systems to provide outputs with high enough quality. At the moment, the amount of human interaction that is needed afterward to fix problems with the algorithms or models that generative AI is producing is still quite significant.”
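For a sense of scale, the five-sigma request above amounts to only a few lines once the data is in hand. The baseline-versus-new-parts split and the numbers are hypothetical, since the article doesn't specify where the center line comes from.

    import statistics

    baseline = [0.98, 1.01, 0.99, 1.02, 1.00, 1.01, 0.97, 1.03]  # healthy history
    mu, sigma = statistics.mean(baseline), statistics.stdev(baseline)

    new_parts = [1.00, 1.02, 1.74, 0.99]
    flagged = [(i, x) for i, x in enumerate(new_parts)
               if abs(x - mu) > 5 * sigma]
    print(flagged)   # [(2, 1.74)] -> only the shifted part trips the limit

As Schuldenfrei notes, the hard part isn't this script; it's wiring it to the right data and keeping it valid in production.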

A lingering question is how to access the existing test programs needed to train models that generate new test programs when everyone is protecting important test IP. “Most people value their test IP and don’t necessarily want to set up guardrails around the training and utilization processes,” Butler said. “So finding a way to accelerate the overall process of developing test programs while protecting IP is the challenge. It’s clear this kind of technology is going to be brought to bear, just like we already see in the software development process.”

Failure analysis
Failure analysis is typically a costly and time-consuming endeavor for fabs because it requires a trip back in time to gather wafer processing, assembly, and packaging data specific to a particular failed device, known as a returned material authorization (RMA). Physical failure analysis is performed in an FA lab, using a variety of tools to trace the root cause of the failure.

While scan diagnostic data has been used for decades, a newer approach involves pairing a digital twin with scan diagnostics data to find the root cause of failures.

“Within test, we have a digital twin that does root cause deconvolution based on scan failure diagnosis. So instead of having to look at the physical device and spend time trying to figure out the root cause, since we have scan, we have millions and millions of virtual sample points,” said Siemens’ Press. “We can reverse-engineer what we did to create the patterns and figure out where the mis-compare happened within the scan cells deep within the design. Using YieldInsight and unsupervised machine learning with training on a bunch of data, we can very quickly pinpoint the fail locations. This allows us to run thousands, or tens of thousands, of fail diagnoses in a short period of time, giving us the opportunity to identify the systematic yield limiters.”
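As a hedged sketch of that unsupervised step (the fail-site encoding and the data are invented; this is not the YieldInsight implementation), clustering diagnosis results can surface a systematic limiter: dies whose mis-compares trace to the same physical neighborhood land in the same cluster.

    import numpy as np
    from sklearn.cluster import KMeans

    # Each row: (x, y) location the scan diagnosis traced a die's fail to.
    fail_sites = np.array([
        [12.1, 40.2], [12.3, 40.0], [11.9, 40.5],   # repeated fails, one spot
        [55.0, 10.1], [54.8, 10.4],                 # a second hot spot
        [80.2, 77.7],                               # isolated one-off
    ])

    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(fail_sites)
    print(labels)   # dies sharing a label likely share a systematic root cause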

Yet another approach that is gaining steam is using on-die monitors to access specific performance information in lieu of physical FA. “What is needed is deep data from inside the package to monitor performance and reliability continuously, which is what we provide,” said Alex Burlak, vice president of test and analytics at proteanTecs. “For example, if the suspected failure is from the chiplet interconnect, we can help the analysis using deep data coming from on-chip agents instead of taking the device out of context and into the lab (where you may or may not be able to reproduce the problem). Even more, the ability to send back data and not the device can in many cases pinpoint the problem, saving the expensive RMA and failure analysis procedure.”

Conclusion
The enthusiasm around AI and machine learning is being met by robust infrastructure changes in the ATE community to accommodate the need for real-time inferencing of test data and test optimization for higher yield, higher throughput, and chiplet classifications for multi-chiplet packages. For multi-core designs, packetized test, commercialized as an SSN methodology, provides a more flexible approach to optimizing each core for the number of scan chains, patterns and bus width needs of each core in a device.

The number of testing applications that can benefit from AI continues to rise, including test time reduction, Vmin/Fmax search reduction, shift left, smart pairing of chiplets, and overall power reduction. New developments like identical source files for all setups across design, characterization, and test help speed the critical debug and development stage for new products.

Reference

  1. D. Sculley, et al., “Hidden Technical Debt in Machine Learning Systems,” NeurIPS 2015. https://proceedings.neurips.cc/paper_files/paper/2015/file/86df7dcfd896fcaf2674f757a2463eba-Paper.pdf


  • ✇PCGamer latest

Optoma UHZ55 projector review

5 August 2024 at 15:21

A 'gaming projector' used to be a bit of an oxymoron, but today you can get projectors that speed along with ridiculous 240 Hz refresh rates and a super low 4 ms input latency. The Optoma UHZ55 is one of those. It's built around competitive gaming, but it's not the only one out there today. It does a lot, and it does it well for the price, but the question is whether it stands up against the rest of the top tier projectors on our best gaming projector roundup on specs other than speed. Because, let's face it, not everyone needs a superspeed projector. And not everyone has the cash to spend on often-wasted refresh rates.

As it turns out, it's a pretty close call. But there's a couple of things holding the Optoma UHZ55 back.

Out of the box, the Optoma UHZ55 is pretty understated. No flashy design, just your average black projector with a hatch of wide slits on the sides for exhausting all that air. And it needs them, because this thing really blows at higher refresh rates. You definitely notice it fire up when you switch up into gaming mode. Which isn't so fantastic when the projector is placed right by your head.

Thankfully the keystone correction and vertical lens shift make it a lot easier to project from an angle other than directly in front of your chosen wall. Not only are there digital and physical controls for zoom, focus and lens shift, they're all easy to set. It's important to note, however, that the keystone or "geometric correction" can't be changed once you've set the lens tilt. Moreover, in gaming mode, the keystone resets, so you kinda do have to put it behind your head or get it rigged up to the ceiling if you want the full speed the beamer offers.

UHZ55 specs

Optoma UHZ55 gaming projector

(Image credit: Future)

Projector type: DLP
Lamp type: Laser
Resolution: 1080p
Image size: Up to 300-inch
Refresh rate: Up to 240Hz
Response time: 4ms
Throw ratio: 1.21:1 ~ 1.59:1 (Long throw)
Brightness: 3,000 ANSI lumens
Inputs: 1x Ethernet, 1x HDMI 2.0 (eARC), 1x RS232, 2x HDMI 2.0, 3x USB Type-A, 1x 3.5mm jack, 1x S/PDIF
Dimensions: 337 x 265 x 119.3mm
Features: Keystone correction, 1.3 Zoom, Creative Cast, Optoma Connect
Light lifespan: 30,000 hours
Warranty: 3 Year limited
Price: $2,099 | £1,498

The UI is easy to navigate, either with the controls on the top of the projector, or the little light up controller. But as much as the UI is gorgeous, the saddest thing about the Optoma UHZ55 is that it runs on a custom Android operating system. Aside from the few apps on the Optoma Marketplace, such as Netflix, TED, Prime, there's not a lot to choose from. Google Play services are completely off the cards, meaning no YouTube.

That could be a dealbreaker for some, but if you're always going to have your rig hooked up, it shouldn't be too much of a problem as you'll have that for programs. Otherwise, you'll need to invest in an Amazon Fire Stick or similar, which means shelling out for yet another device if you just want to catch up on your subscriptions without moving your gaming laptop, or whatever.

Hooking it up, it becomes clear where Optoma's focus was pointed. There's absolutely no noticeable latency from inputs with the Optoma UHZ55. Whether in 4K or 1080p in gaming mode, the movement is smooth as anything, and I could make the most of the super high frame rates with V-Sync on.

The image is damn sharp when you perfect the focus, too, even looking great with the curtains open on a sunny day thanks to all those lumens.

Optoma UHZ55 gaming projector (Image credit: Future)

But how does it fare against the rest of the best gaming projectors?

You can get the BenQ TK700STi—currently on our best projector list as the fastest at 240 Hz/4 ms—for $1,699/£1,340. That's a saving of $400/£160 over the Optoma, for essentially the same specs, and a speaker that's just as terrible. Of course, the BenQ's lamp life is rated at nowhere near as long as the Optoma UHZ55's, which starts to look really good for the long haul, and maybe worth spending the extra money on.

Buy if...

You're a competitive gamer: If you've a need for speed, and your machine can make use of that 240 Hz refresh rate with high fps, the Optoma UHZ55 is one of the fastest beamers we've tested.

You have the space for long throw: The Optoma UHZ55 needs to be positioned between 1.2 and 8.1 m away from the projection surface, and close to straight on since the keystone doesn't work in gaming mode… make sure you have somewhere to put it. 

Don't buy if...

You're planning to use the built-in speaker: The speaker is weak and tinny to the point of being quite painful, especially if you're sitting right next to it. You will want to grab your own speaker setup for this one.

You won't always have a machine / firestick: Without Google Play, the Optoma UHZ55's app options are few. It's only good for Spotify, Netflix, TED, and Amazon Prime on its own. 

That said, compare it to the current top beamer on our list—the BenQ X3100i—and you start to see why speed isn't everything. The Optoma UHZ55 is almost £200 more ($800 for those in the US) for a higher refresh rate and half the input latency. So if you're packing a monster rig with one of the best graphics cards around, you'll likely want to make the most of the high refresh rate on the Optoma, but if you're barely hitting 120 fps at 1080p, there's not much need to spend the extra dollar.

Sure, you're sacrificing some speed with the BenQ X3100i, but it's generally a more well-rounded projector, with the same lamp-life rating and fantastic image quality, plus 5W stereo speakers and an actual Android TV dongle backing it up.

Essentially, unless you already have one of the best speakers for PC gaming, a machine that can pump out 200+ fps—and an Android dongle/Fire Stick if said machine is a gaming laptop and not always around—it's a little harder to recommend the UHZ55. It's difficult to justify the extra cost against its (ever so slightly slower) competitors, plus all the extras you'll have to get hold of. But if you're all set with a surround sound system and a monster gaming PC, it'll eat through those frames like nobody's business. 


LG Tandem OLED display hits mass production — Dell XPS 13 is the super vibrant display's first design win

24 June 2024 at 16:36
LG has announced the successful beginning of mass production for its first Tandem OLED display. The 13-inch 2880x1800 pixel touchscreen display will see use in Dell XPS 13 laptops first, unique for its stacking of two light-emitting layers which together increase brightness and decrease power consumption.


  • ✇Rock, Paper, Shotgun

Should you bother with... ultrawide gaming monitors?

20 June 2024 at 16:32

I realised recently that a juicy subject for another Should You Bother With has been staring me in the face – or rather, I’ve been staring at it. Ultrawide gaming monitors have clearly avoided non-starter status, given they’ve been around for years, seemingly being exchanged for currency – and yet they’re nowhere near what you might consider the 'default' option when making a display upgrade. Regular widescreen monitors, with regular 16:9 aspect ratios, remain the go-to. So why switch?



  • ✇Ars Technica - All content

Micro LED monitors connect like puzzle pieces in HP multi-monitor concept

31 May 2024 at 22:32
Yes, there are bigger monitors, but is there a better way to have a tri-monitor setup? (credit: Getty)

In a technical disclosure published this month, HP explored a Micro LED monitor concept that would enable consumers to easily use various multi-monitor configurations through the use of "Lego-like building blocks." HP has no immediate plans to make what it has called "composable Micro LED monitors," but its discussion explores a potential way to simplify multitasking with numerous displays.

HP's paper [PDF], written by HP scientists and technical architects, discusses a theoretical monitor that supports the easy addition of more flat or curved screens on its left, right, or bottom sides (the authors noted that top extensions could also be possible but they were "trying to keep the number of configurations manageable"). The setup would use one 12×12-inch "core" monitor that has a cable to the connected system. The computer's operating system (OS) would be able to view the display setup as one, two, or multiple monitors, and physical switches would let users quickly disable displays.

  • The illustration shows a monitor made of a core unit and two extension panels viewed as three monitors (left), two monitors (middle), and two monitors with different orientations (right). [credit: HP/Technical Disclosure Commons]

Not a real product

HP's paper is only a technical disclosure, which companies often publish in order to support potential patent filings. So it's possible that we'll never see HP release "composable Micro LED monitors" as described. An HP spokesperson told me:


Dough Spectrum Black 27-inch OLED gaming monitor review: Pro-level accuracy and premium performance

19 May 2024 at 16:21
Dough sets a high standard with its Spectrum 27-inch OLED gaming monitor. It delivers HDR 400, 240 Hz, Adaptive-Sync and wide gamut color along with pro-level color accuracy and premium game performance.


  • ✇PCGamer latest

TCL smashes the refresh rate barrier as it demonstrates a 4K 1000Hz panel

20 May 2024 at 09:24

Gaming monitor technology continues to advance. When we see new models come to market with ever higher refresh rates, they're usually 1080p models. With the advent of OLED technology in particular, refresh rates are ticking upwards on higher resolution panels too. 

1000Hz, though? That's never been seen on any gaming panel. Screens capable of 540Hz are on the market, but they're still rare, and very expensive 1080p TN panels aimed at competitive gamers. 4K examples such as the stunning Asus ROG Swift OLED PG32UCDM are capable of 240Hz, but that's still well short of the 1000Hz 4K panel spotted by Blur Busters at the DisplayWeek 2024 conference in California.

Surprisingly, TCL has offered next to no information on this panel, though Blur Busters say it's an LCD display. OLED is catching up fast, with dual-mode 480Hz offerings from Asus and LG coming to market in 2024, and 1000Hz prototypes possible within the next couple of years.

While this panel looks mightily impressive on paper, there are some caveats. The screen appears to be very much a proof of concept, with no information on what, if any, compromises needed to be made in order to achieve such a refresh rate.


Then there's the question of whether DisplayPort 2.1 with Display Stream Compression (DSC) is capable of driving such a panel; if not, a future DP 3.0 or HDMI 3.0 standard may be required to support such a refresh rate without crazy levels of compression or loss of color depth. And you'll need a powerful graphics card and/or CPU to get anywhere near those kinds of frame rates. Some kind of next-gen frame generation technology may be a prerequisite.
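Rough arithmetic shows why. In the sketch below the pixel math is exact, but the DP 2.1 figure is the published effective rate for UHBR20 (about 77 Gbit/s) and blanking overhead is ignored, so treat the result as a lower bound.

    def raw_gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    need = raw_gbps(3840, 2160, 1000)           # 4K, 1000 Hz, 8-bit RGB
    print(f"{need:.0f} Gbit/s uncompressed")    # ~199 Gbit/s
    print(f"{need / 77:.1f}:1 DSC needed vs DP 2.1 UHBR20")   # ~2.6:1

A 2.6:1 ratio is within what DSC is rated for, but it leaves no headroom for 10-bit color or higher resolutions, which is where the talk of future standards comes from.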

So, while this is an impressive feat by TCL, it's the kind of thing that'll have limited practicality for the foreseeable future. 1000Hz is more likely to debut at 1080p or 1440p resolutions before we see 1000Hz at 4K.

Beyond gaming, I look forward to seeing what 1000Hz content would look like. Planet Earth 4 perhaps? By the time displays supporting 1000Hz at 4K become a thing, Sir David Attenborough will probably be 120 years old, and we'll be eyeing the next big thing. I want my wall sized 16K 1000Hz window replacement panel. 

In the meantime, do check out our picks for best high refresh rate gaming monitor in 2024. You won't even need a 2030 graphics card to run any of these.


  • ✇PCGamer latest

MSI's new QD-OLED gaming monitors are a feast for the eyes

17 May 2024 at 16:16

Popping colours, refresh rates that push the limits of what the human eye can actually perceive, and deep blacks that really make you feel the cosmic void when playing a space sim. A powerful monitor is a key component of a serious PC gamer's arsenal, and hardware manufacturer MSI has been dutifully creating them for decades now, ensuring you're always getting the best visual experience when settling in for a session.

Now, MSI has released its latest line of flagship QD-OLED gaming monitors for 2024, but before introducing the contenders, let's look at the technologies and designs that make these monitors so special.

It's no secret that OLED panels offer the best visual quality on the market, with fast response times (0.03ms for all QD-OLED panels), wide viewing angles, and those famous contrast ratios, where each pixel can switch off entirely to produce 'true black,' making for a vibrant image that you can only fully appreciate when it's right there in front of you. All new MSI monitors with QD-OLED panels now use DisplayHDR True Black 400—one of the latest iterations of HDR technology, which features global dimming, peak luminance of 400 cd/m², and true 8-bit image quality.

The 49" MPG491CQP QD-OLED Gaming Monitor (Image credit: MSI)

MSI's QD-OLED range enlists the power of AI too, with AI Vision revealing details in dark areas and saturating colours beautifully. MSI Gaming Intelligence, meanwhile, has a bundle of features to further optimise your experience. Night Vision, for example, reveals the dark areas in the corner, while Optix Scope adds a hardware level zoom to any weapon in your game. Smart Crosshair is also a useful tool to add a crosshair in any game and changes its colour automatically for maximum contrast with the background so you always have a clear aim. 

If you are looking for a monitor for your console, MSI QD-OLED monitors can also be a good choice thanks to the increased bandwidth of HDMI 2.1. Auto Low Latency Mode support makes these monitors especially powerful for those looking for an edge in online and competitive games. 

One of the historical drawbacks of OLED has been burn-in, where the panel degrades over time due to the heat emitted from static images on the screen. MSI's QD-OLED panels offset this with a Graphene film and customised heatsink design, which together disperse the heat and extend the panel's lifespan. OLED CARE 2.0 adds to its predecessor's roster of techs like Pixel Shift and Static Screen Detection with new tricks like Multi Logo Detection, Taskbar Detection, and Boundary Detection, all minimising the risk of burn-in.

(Image credit: MSI)

And for that sweet, sweet peace of mind, each panel comes with a 3-Year Burn-in Warranty too.

You can find out more about MSI QD-OLED panels right here, but for now let's move onto the actual monitors, shall we? 

MAG 341CQP QD-OLED (£999)

The MSI MAG 341CQP Gaming Monitor (Image credit: MSI)

This 34" Ultrawide QHD (3440x1440) display features DisplayHDR True Black 400, an impressive 1500000:1 contrast ratio, 175 Hz refresh rate, and VESA Certified ClearMR 9000, ensuring crystal-clear images with minimal motion blur.

MPG 321URX QD-OLED (£1299)

The MPG 321URX QD-OLED Gaming Monitor (Image credit: MSI)

The 32" MPG 321URX QD-OLED  delivers a pure 4K (3840x2160) UHD experience, capping out at 240Hz, which even the most powerful graphics cards will have their work cut out in maximising. If you are looking for a 4K monitor to get the best out of your high end graphics card, you have a great option here. 

MPG 271QRX QD-OLED (£999)

The MPG 271QRX QD-OLED Gaming Monitor (Image credit: MSI)

This 27" WQHD 2K (2560x1440) display comes with a searing 360 Hz refresh rate, DisplayHDR TrueBlack 400, and is VESA Certified ClearMR 13000, which is the highest motion range (and lowest blur) rating a monitor can have. With high refresh rate and low latency, the MPG 271QRX QD-OLED is also a good option for competitive esports players.

MPG 491CQP QD-OLED (£1299)

The MPG 491CQP QD-OLED Gaming Monitor (Image credit: MSI)

Want a huge, ultrawide screen that will expand your experience all the way into your peripheral vision? Then this one's for you. With a DQHD (5120x1440) resolution (that's a 32:9 aspect ratio) and 49" of screen real estate, this panel is a powerhouse for both gaming and work. It's endowed with all the perks of other QD-OLED displays, with a 144Hz refresh rate that's more than enough for most modern gaming needs.

Would you like to know more? Find out about the availability of these monitors at the MSI website. You can also follow what's happening in the MSI labs on Facebook, Instagram, and X (Twitter).


  • ✇Eurogamer.net

MSI MPG 321URX review: the best QD-OLED monitor for US buyers

30 April 2024 at 09:00

We're in the middle of a bit of a monitor revolution that comes around once every few years, as new panel types take hold and wow the crowds. That's certainly the case in 2024 with a slew of new monitors based around Samsung's third-gen QD-OLED panels.

We've been focusing on the 32-inch 4K 240Hz offerings, which include the sublime Asus ROG Swift PG32UCDM, which we lauded as the best gaming monitor we've ever tested, while the Alienware AW3225QF is the best value option for UK buyers.

Since then, I've been testing MSI's competitor to both of these, the catchily-named MPG 321URX, and it's also a fantastic gaming monitor - and one that looks like the best option for those in the US, given a lower introductory price on the opposite side of the Atlantic. But how does it compare to the Dell and Asus models we've tested already?


  • ✇XDA

10 ways a gaming PC is different from your regular PC

22 April 2024 at 17:00

If you haven't been inducted into the PC gaming community yet, you might have some doubts about what exactly constitutes "gaming PCs". After all, isn't every PC fundamentally the same? Well, in many ways, yes. But, there are also a lot of differences between gaming PCs and regular PCs, right from the kind of components used to the considerations involved when building a gaming PC.

  • ✇Ars Technica - All content

Meet QDEL, the backlight-less display tech that could replace OLED in premium TVs

22 April 2024 at 13:00
Vials of quantum dots (credit: Getty)

What comes after OLED?

With OLED-equipped TVs, monitors, and other gadgets slowly becoming more readily available at lower prices, attention is turning to what the next landmark consumer display tech will be.

Micro LED often features in such discussions, but the tech is not expected to start hitting consumer devices until the 2030s. Display makers are also playing with other futuristic ideas, like transparent and foldable screens. But when it comes to technology that could seriously address top user concerns—like image quality, price, and longevity—quantum dots seem the most pertinent at the moment.


  • ✇XDA

Best 4K monitors for MacBook Pro in 2024

9 March 2024 at 00:00

The MacBook Pro (M3, 2023) is shaping up to be Apple's best laptop yet, and the best MacBook deserves the best 4K monitor to take advantage of the powerful new M3 chipset. Hooking up your MacBook Pro to an external display lets you turn your laptop into a full-fledged Mac desktop. The new MacBook Pro also has an HDMI port along with USB-C, making it compatible with the vast majority of 4K monitors on the market today. You don't have to sort through them all to find the right one for you, though. We've already done the heavy lifting to round up the best 4K monitors for the MacBook Pro right here.

The 16 best monitors for 2024

The sheer number of computer monitors on the market is enough to intimidate anyone looking to add one to their workspace or gaming rig. Plus, the technology has evolved rapidly even in just the past year alone, with OLED Flex, QD-OLED and built-in smart platforms becoming more prevalent in new monitors. That’s on top of big improvements in things like color accuracy, image quality, size and resolution as well. But fear not, Engadget can help you make sense of the computer monitor space and help you decide which monitor (or, at the very least, type of monitor) is right for you. Whether you’re a business user, a content creator, a multitasker or into competitive gaming, you have plenty of options to choose from, and we’ve outlined our top picks below.

Factors to consider

Panel type

The cheapest monitors are still TN (twisted nematic), which are strictly for gamers or office use. VA (vertical alignment) monitors are also relatively cheap, while offering good brightness and a high contrast ratio. However, content creators will find that IPS (in-plane switching) LCD displays deliver better color accuracy, picture quality and viewing angles.

If maximum brightness is important, a quantum dot LCD display is the way to go — those are typically found in larger displays. OLED monitors are now available and offer the best blacks and color reproduction, but they lack the brightness of LED or quantum dot displays. Plus, they cost a lot. The latest type of OLED monitor, called QD-OLED from Samsung, just came out in 2022. The most notable advantage is that it can get a lot brighter, with monitors shown at CES 2022 hitting up to 1,000 nits of peak brightness.

MiniLEDs are now widely used in high-end displays. They’re similar to quantum dot tech, but as the name suggests, they use smaller LED diodes that are just 0.2mm in diameter. As such, manufacturers can pack in up to three times more LEDs with more local dimming zones, delivering deeper blacks and better contrast.

Screen size, resolution and display format

In this day and age, screen size rules. Where 24-inch displays used to be more or less standard (and can still be useful for basic computing), 27-, 32-, 34- and even 42-inch displays have become popular for entertainment, content creation and even gaming these days.

Nearly every monitor used to be 16:9, but it’s now possible to find 16:10 and other more exotic display shapes. On the gaming and entertainment side, we’re also seeing curved and ultrawide monitors with aspect ratios like 21:9. If you do decide to buy an ultrawide display, however, keep in mind that a 30-inch 21:9 model is the same height as a 24-inch monitor, so you might end up with a smaller display than you expected. As a rule of thumb, add 25 percent to the size of a 21:9 monitor to get the vertical height you’d expect from a model with a 16:9 aspect ratio.
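That rule of thumb is straight Pythagoras, as the quick check below shows (flat rectangular panels assumed; curvature changes the numbers slightly).

    import math

    def panel_height(diagonal_in, aspect_w, aspect_h):
        unit = diagonal_in / math.hypot(aspect_w, aspect_h)
        return aspect_h * unit                 # height in inches

    print(f"{panel_height(30, 21, 9):.1f}")    # 30-in 21:9 -> ~11.8 in tall
    print(f"{panel_height(24, 16, 9):.1f}")    # 24-in 16:9 -> ~11.8 in tall
    print(f"{panel_height(34, 21, 9):.1f}")    # 34-in 21:9 -> ~13.4 in, close to a 27-in 16:9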

A 4K monitor is nearly a must for content creators, and some folks are even going for 5K or all the way up to 8K. Keep in mind, though, that you'll need a pretty powerful computer to drive all those sharp pixels. And 4K resolution should be paired with a screen size of 27 inches and up, or you won't notice much difference over 1440p. At the same time, I wouldn't get a model larger than 27 inches unless it's 4K, as you'll start to see pixelation if you're working up close to the display.
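
The pixelation point is really a pixel-density question, and the standard pixels-per-inch calculation makes the trade-offs concrete. A quick sketch (the ppi helper is our own, not from any vendor tool):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch: diagonal pixel count divided by diagonal size.
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2160, 27)))  # ~163 PPI: 4K at 27 inches
    print(round(ppi(2560, 1440, 27)))  # ~109 PPI: 1440p at 27 inches
    print(round(ppi(2560, 1440, 32)))  # ~92 PPI: 1440p stretched to 32 inches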

One new category to consider is portable monitors designed to be carried and used with laptops. Those typically come in 1080p resolutions and sizes from 13 to 15 inches. They usually have a lightweight kickstand-type support that folds up to keep things compact.

[Image: Samsung Smart Monitor M5. Credit: Samsung]

HDR

HDR is the buzzy monitor feature to have these days, as it adds vibrancy to entertainment and gaming – but be careful before jumping in. Some monitors that claim HDR in their marketing materials don't even conform to a base standard. To be sure that a display at least meets minimum HDR specs, you'll want to choose one with a VESA DisplayHDR rating, with each tier number corresponding to a minimum peak brightness in nits.

However, the lowest DisplayHDR 400 and 500 tiers may disappoint you with a lack of brightness, washed-out blacks and mediocre color reproduction. If you can afford it, choose a model rated DisplayHDR 600 or 1000, or one of the True Black 400, 500 or 600 tiers. The True Black ratings are designed primarily for OLED models, with maximum black levels of 0.0005 nits.
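
If you'd rather sanity-check a spec sheet than trust a marketing page, note that each tier name encodes its brightness floor. A small lookup sketch (the values mirror VESA's public tier naming; the dictionary and clears_tier helper are our own illustration):

    # Minimum peak brightness (nits) implied by each VESA DisplayHDR tier name.
    DISPLAYHDR_PEAK_NITS = {
        "DisplayHDR 400": 400,
        "DisplayHDR 500": 500,
        "DisplayHDR 600": 600,
        "DisplayHDR 1000": 1000,
        "DisplayHDR True Black 400": 400,  # True Black tiers also require
        "DisplayHDR True Black 500": 500,  # near-zero black levels
        "DisplayHDR True Black 600": 600,  # (around 0.0005 nits)
    }

    def clears_tier(panel_peak_nits, tier):
        # Rough check: does a panel's peak brightness meet a tier's floor?
        return panel_peak_nits >= DISPLAYHDR_PEAK_NITS[tier]

    print(clears_tier(450, "DisplayHDR 400"))  # True
    print(clears_tier(450, "DisplayHDR 600"))  # False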

Where televisions typically offer HDR10 and Dolby Vision or HDR10+, most PC monitors support only the HDR10 standard, aside from a few (very expensive) models. That doesn't matter much for content creation or gaming, but HDR streaming on Netflix, Amazon Prime Video and other services won't look quite as punchy. In addition, the best gaming monitors are usually the ones supporting DisplayHDR 600 (and up), rather than content creation monitors – with a few exceptions.

Refresh rate

Refresh rate is a key feature, particularly on gaming monitors. A bare minimum nowadays is 60Hz, and refresh rates of 80Hz and higher are much easier on the eyes. However, most 4K displays top out at 60Hz, with some rare exceptions, and the HDMI 2.0 spec only supports 4K at 60Hz, so you'd need at least DisplayPort 1.4 (4K at 120Hz) or HDMI 2.1. The latter is now available on a number of monitors, particularly gaming displays, but it's only supported on the latest NVIDIA RTX 3000- and 4000-series and AMD RX 6000-series GPUs.
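
That 60Hz ceiling is ultimately a bandwidth problem: uncompressed 4K video at higher refresh rates simply outruns HDMI 2.0's data rate. A back-of-the-envelope sketch (the blanking multiplier is a rough approximation; real video timings vary by standard):

    def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
        # Approximate uncompressed video data rate in Gbit/s; the blanking
        # multiplier is a rough stand-in for real video timing overhead.
        return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

    print(data_rate_gbps(3840, 2160, 60))   # ~14.3 Gbit/s: near HDMI 2.0's limit
    print(data_rate_gbps(3840, 2160, 120))  # ~28.7 Gbit/s: DP 1.4 / HDMI 2.1 territory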

Inputs

There are essentially three types of modern display inputs: Thunderbolt, DisplayPort and HDMI. Most monitors built for PCs come with the latter two, while a select few (typically built for Macs) use Thunderbolt. To add to the confusion, USB-C ports may be Thunderbolt 3 (and, by extension, DisplayPort) compatible, so you may need a USB-C to Thunderbolt or DisplayPort cable or adapter, depending on your display.

Color bit depth

Serious content creators should consider a more costly 10-bit monitor that can display billions of colors. If budget is an issue, you can go for an 8-bit panel that can fake billions of colors via dithering (often spec’d as “8-bit + FRC”). For entertainment or business purposes, a regular 8-bit monitor that can display millions of colors will be fine.
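
The "millions" versus "billions" shorthand falls straight out of the bit arithmetic: three color channels at 8 or 10 bits each. A one-line check (illustrative helper):

    def color_count(bits_per_channel, channels=3):
        # Distinct colors addressable at a given per-channel bit depth.
        return 2 ** (bits_per_channel * channels)

    print(f"{color_count(8):,}")   # 16,777,216 -> "millions of colors"
    print(f"{color_count(10):,}")  # 1,073,741,824 -> "billions of colors"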

Color gamut

The other aspect of color is the gamut, which expresses the range of colors that can be reproduced, not just the number of colors. Most good monitors these days can cover the sRGB and Rec.709 gamuts (designed for photos and video, respectively). For more demanding work, though, you'll want one that can reproduce more demanding modern gamuts like AdobeRGB, DCI-P3 and Rec.2020, which encompass a wider range of colors. The latter two are often used for film projection and HDR, respectively.

Console gaming

Both the Xbox Series X and Sony's PS5 can handle 4K 120Hz HDR gaming, so if you're into resolution over pure speed, you'll want a monitor that can keep up and provide the best gaming experience possible. 4K resolution, HDR and at least 120Hz are the minimum starting point, and fortunately there are 27-inch displays with those specs starting at well under $1,000.

Pricing and parts shortages

Though the pandemic has eased, monitor supply is still a bit tighter than pre-pandemic levels due to supply and demand issues. To that end, you may have trouble finding monitors at Amazon, B&H or elsewhere at the suggested retail price. For the guide below, we're basing our picks on the MSRP, as long as the street price doesn't exceed it by more than $25.

Best monitors under $200

Best monitors under $400

Best monitors under $500

Best monitors under $1,000

This article originally appeared on Engadget at https://www.engadget.com/how-to-buy-a-monitor-143000069.html?src=rss

[Image: ASUS ProArt Display PA27UCX-K monitor in a video editing setup. Credit: ASUS]

AMD stops certifying monitors, TVs under 144 Hz for FreeSync

By: Scharon Harding
March 8, 2024 at 21:35
[Image: AMD's depiction of a game playing without FreeSync (left) and with FreeSync (right). Credit: AMD]

AMD announced this week that it has ceased FreeSync certification for monitors or TVs whose maximum refresh rates are under 144 Hz. Previously, FreeSync monitors and TVs could have refresh rates as low as 60 Hz, allowing for screens with lower price tags and ones not targeted at serious gaming to carry the variable refresh-rate technology.

AMD also boosted the refresh-rate requirements for its higher AdaptiveSync tiers, FreeSync Premium and FreeSync Premium Pro, from 120 Hz to 200 Hz.

The new minimum refresh-rate requirements apply to monitors and TVs; the requirements for laptops haven't changed.


Discover Samsung Early Access event brings special discounts on the brand's best monitors

March 2, 2024 at 14:30

Although the Discover Samsung sales event doesn't officially start until next week, we're getting a special sneak peek at some of the promotions, with exclusive deals on some of the brand's best monitors. For an extremely limited time, you can now save up to $700 on Samsung's best ultrawide monitors. Of course, you'll want to be quick because this sales event ends on March 3.


This 45-inch Lenovo Legion gaming monitor is down to £699 at Very

By: Will Judd

Today's my last day writing deals at RPS! That's a shame, but I still have two to four articles to share with you, and I'm going to start with this ultrawide monitor from Lenovo, the Legion R45W-30. This is a huge 45-incher, offering what is essentially two 1440p 165Hz screens side-by-side. It normally retails for £799, but today Very are offering it for £100 off, at £699. That's a good price for what is essentially two mid-range gaming monitors seamlessly joined into one.


This 1440p 240Hz HP Omen monitor costs just $299.99 in the US after a $130 discount

If you like to play competitive games that benefit from a higher frame rate and refresh rate, then this 1440p 240Hz monitor for $300 at Best Buy US is well worth knowing about. This is an HP Omen 27qs, to be exact: a well-regarded Fast IPS model that combines good all-around characteristics with excellent motion handling and that high refresh rate, now discounted by $130.



Huge $600 savings on the 2023 Samsung 49-inch OLED gaming monitor

By: Matt Horne
February 16, 2024 at 18:57

Your gaming rig deserves the best, and displays don't come much better than the Samsung 49-inch Odyssey G9 OLED curved gaming monitor. The stunning 2023 display has an on-page coupon available on Amazon right now that drops the price to $999.99. That's $600 off, and a price only previously seen on Black Friday.

Samsung 49-inch Odyssey G9 OLED Curved Gaming Monitor for $999.99 ($600 off)


Best OLED monitors in 2024

By: Cale Hunt
February 21, 2024 at 01:00

The best OLED monitors will change the way you view your digital world for the better, whether it involves gaming, creativity, development, or other professional work. Organic Light Emitting Diode (OLED) screens offer truly deep contrast thanks to their ability to turn off individual LEDs, as well as plenty of brightness without the need for a separate backlight. They have the low response time and high refresh rate that PC gamers crave, and they can produce accurate color across gamuts for professional purposes.
