All posts by RejZoR


The Alien franchise is back with Alien: Romulus!

I know I haven’t posted for a while and I usually don’t post about movies, but this dropped today totally unexpectedly and, as a huge Alien fan, I have to talk about it. Because it looks freaking amazing! I know this is just a teaser trailer and there is plenty to f**k up in a full movie, but after the awkward Prometheus and the boring, confusing Covenant, from the looks of it we’re back to the roots with a proper, full-on Alien movie, and this one looks absolutely spot on. Space? Check. Space station? Check. Small group of people? Check. Aliens? Check. Badass female protagonist? From the looks of it, check as well. It has all the good stuff, and the way they presented it in these few seconds was really awesome. Alien: Romulus is coming on 16th August 2024. How have we not heard about this before? It’s just 4 months away! I freaking can’t wait!

ASUS falsely advertising their OLED monitor ROG Swift PG27AQDM as G-Sync Compatible even though it’s not

I got myself an OLED monitor and the ASUS ROG Swift PG27AQDM checked all the tickboxes for me. That was until I recently found people mentioning how this monitor tears like crazy at lower framerates, which made me curious. I hadn’t noticed it because I’m currently playing older titles that run at a locked 240fps on my RTX 3080. But I decided to inspect things and noticed that ASUS is indeed lying to us, and I think it’s a really scummy thing to do. Especially since ASUS has said “we’re sorry” several times for the same kind of things and then fucking does it again. What the hell, ASUS?

So, let’s begin with the evidence…

The G-Sync Compatible claim in the main product description, taken from the ASUS webpage…

ASUS_PG27AQDM_GSYNC2

Scrolling further down the product page, they brag about G-Sync once again.

ASUS_PG27AQDM_GSYNC

Then there is the detailed tech specs section, where ASUS claims the PG27AQDM has G-Sync Compatible certification.

ASUS_PG27AQDM_Certification

Now to the evidence that shows how ASUS is misrepresenting the product and stating things about it that aren’t true.

NVIDIA_GSYNC_CERTIFIED_LIST

Image taken from the NVIDIA webpage with the entire list of G-Sync Compatible monitors (dated 2024-01-13):

https://www.nvidia.com/en-us/geforce/products/g-sync-monitors/specs/

Notice how, instead of stating which drivers support this monitor, it just says “Future”? Yeah, that’s a thing. I bought the monitor NOW because it claimed it’s G-Sync Compatible. Because I hate image tearing and this was like the No.1 thing I cared about. ASUS claimed it supports it all. Now NVIDIA itself, who certifies these monitors, lists it as “Unsupported”, with support coming in the “Future”. When that might be, no one knows.

Further, this is what NVIDIA Control Panel, running the latest NVIDIA drivers version 546.33, states about the PG27AQDM monitor on my system…

NVIDIA_CONTROLPANEL_GSYNC

ASUS, care to explain why you are claiming this monitor is G-SYNC Compatible when it CLEARLY isn’t? What the actual fuck? This isn’t a 150€ monitor, where even then false claims would be unacceptable; this thing costs over 1000€ here in Europe. And while it’s not the most expensive monitor out there, I think you can understand the outrage when you pay a fucking four-figure price in € and it doesn’t have one of the primary features required for any gaming monitor to be good. And that’s having fucking functioning Variable Refresh Rate support, just like you claim on the product description page.

The unfortunate thing is that I’m too small of a fish for ASUS to care about my rant on a small personal blog, but I sure hope someone big like JayzTwoCents, Hardware Unboxed or Linus Tech Tips picks up on this. Because then ASUS would quickly become “aware” and “concerned” about the issue and resolve it quickly. Instead, god knows how long we’ll be waiting for the “Future” to come and actually support G-Sync the way they claim it’s supposed to.

 

Buying LEGO as an adult…

20231220_110255

So, I guess buying LEGO as an adult looks something like this, hehe. It has been like 2 decades since I last bought LEGO for myself (buying it for cousins and nephews doesn’t quite count) and I thought I should treat myself to this one. And I freaking love it. I was expecting the pot part to be a simple assembly, but the pot alone has like 400 pieces and it’s insanely complex. I wonder if someone at LEGO literally has a job of making this stuff up and figuring out how to do it with the blocks they already make…

It actually looks really neat and, from a distance, it looks like an actual plant. And I love how LEGO reuses their existing parts to make new stuff. Like, for example, the orchid’s breathing tubes/roots coming out of the pot are parts used for LEGO dinosaurs. Or the core of the orchid blossom is a tiny pink frog piece. They even mention it in the assembly manual lol. It’s a cute and cheeky touch and I like it.

I’m actually really amazed how a non-tech toy company just endlessly keeps innovating new stuff and attracting new crowds in this age of everything digital. I know they have been making motorized Technic and Mindstorms models for decades now, but plain LEGO blocks are still their core product.

Anyway, I think I might buy myself the Bonsai Tree next; it should fit nicely on the other end of the desk. I guess the next logical step after the Bonsai Tree would be… an Imperial Star Destroyer. Right? 😀

Galaxy Watch 4 caller ID broken since latest update based on WearOS 4

Oh come on Samsung, it has been almost 2 months since you released the big update for the Galaxy Watch 4 based on WearOS 4 and it’s still broken. Ever since the update, my Galaxy Watch 4 smartwatch doesn’t show who the fuck is calling me or sending me a message. The person is in my Contacts list and all I see is their phone number when I get the call. Then I take out my phone and I see it’s just my dad calling me. I’ve found a bunch of people having the same issue, yet somehow this still hasn’t come to Samsung’s attention.

What’s even more annoying in this situation is that getting in touch with Samsung in ANY way is so god damn difficult. Just trying to contact them keeps you spinning in circles on their webpage. Literally. Then I contacted them through the Samsung Members app and all I got was some sort of generic reassurance that they’ll look into it. Then I contacted my local Samsung service center and they want me to bring the watch in for service. I DON’T WANT TO BE WITHOUT A WATCH FOR A FUCKING MONTH! Fix the fucking problem. It worked fine until the update; after the update, no caller ID anymore. It’s not rocket science.

It’s so weird: when I had an iPhone and an Apple Watch, it was easier to contact Apple than it is Samsung. And they actually fixed the reported bug within 1 month. It was a minor bug, so 1 month is actually pretty respectable. 2 months of having a watch that doesn’t do its main god damn job is fucking infuriating.

Windows 11 Attestation finally fixed with Ryzen 5800X3D

Now, I don’t fully know who caused this, Microsoft, AMD or ASUS, but in my case, on an ASUS Strix X570-E motherboard with a Ryzen 5800X3D, Windows 11 kept “complaining” about the TPM security feature called “Attestation” being “Not supported”, even though the 5800X (regular, non-3D) that I had before supported it. Well, today I got the notification I have set up for my motherboard BIOS that a new version is available. Okay, I checked the ASUS support page and there is indeed a new BIOS update, version 5003, for the ASUS Strix X570-E which mentions a firmware update for the TPM subsystem. And that instantly got me thinking: is this it? So I made screenshots before and after the update. And indeed, Attestation is now “Ready” the way it should be. I was also having issues with the TPM just randomly not being detected in Windows 11 until a reboot brought it back. We’ll see if that was also fixed with this BIOS update. Hopefully it is…

BIOS 4802

TPM_Original

 

BIOS 5003

TPM_New

As you can see, the TPM firmware version now has a newer date and the Attestation status is now also “Ready”.
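By the way, if you’d rather confirm this from a terminal than compare screenshots, here’s a minimal Python sketch that just calls Windows’ built-in tpmtool and prints the relevant lines. I’m assuming a recent Windows 10/11 build where tpmtool getdeviceinformation is available and reports attestation readiness (run it from an elevated prompt if you get an access error):

import subprocess

# Ask Windows' built-in tpmtool for TPM details and print only the lines
# about readiness, attestation and firmware version. Assumes a recent
# Windows 10/11 build where "tpmtool getdeviceinformation" exists.
info = subprocess.run(
    ["tpmtool", "getdeviceinformation"],
    capture_output=True, text=True, check=True
).stdout

for line in info.splitlines():
    if any(word in line for word in ("Ready", "Attestation", "Version")):
        print(line.strip())

It’s just a quicker way to check the same “Ready” status the screenshots show, before and after flashing the BIOS.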

Since this is an AGESA update, this isn’t an ASUS-exclusive fix; it was sent out by AMD. Meaning, if you’re experiencing a similar issue and you understand why this feature is there, check for the latest BIOS update for your motherboard and update it if required. BIOS updates are good to have but usually not necessary, and they can brick your system, so only do it if you understand how to do it and what the risks are. Just a disclaimer.

Starfield Tweaker 1.0 released!

StarfieldLogo

I’ve been playing Starfield since the early release because I was so hooked on the hype, and I absolutely love the game. What I don’t love so much is the lack of certain controls over the game. Specifically, the lack of an FOV (Field of View) setting. But no more! Just like for the Killing Floor 2 game, I’ve now created Starfield Tweaker, a tiny portable tool that allows you to adjust certain hidden or unavailable settings in Starfield. Head to my Starfield Tweaker micro page below and see what tweaks I’ve included!

Starfield Tweaker has also been added to the “My Projects” menu at the top of my blog, where you can always find it directly.

VISIT MY STARFIELD TWEAKER MICRO PAGE

Why is the state of Bluetooth codecs on Windows so awful?!

I’ve been using Bluetooth earbuds on my Windows PC for a very long time now, also for gaming, because I don’t have any wire bothering me. I’ve been using Apple’s AirPods 2 and they worked great; they even reproduce enough bass to be engaging for all the explosions and music. But recently I upgraded to the highly acclaimed SoundPeats Air3 Deluxe HS, which get very high marks almost everywhere and are very accessible because they don’t cost 300€, yet boast some pretty impressive specs: a very good audio profile, frequency response up to 40kHz (which is rather rare since most only go to 20kHz), plus support for aptX and LDAC, which aren’t supported on AirPods 2.

And here comes the problem stated in the title of this article. Windows, for whatever dumb reason, just picks codecs almost at random at this point. Most Bluetooth audio devices these days support the SBC and AAC codecs, of which AAC is the superior one. However, there are also aptX from Qualcomm and LDAC from Sony, both superior to AAC and especially to SBC, which just sounds horrendous in comparison.

What I don’t understand is why Windows 11 offers ABSOLUTELY NO control over Bluetooth codec selection. The whole Bluetooth headphones industry has exploded and almost everything these days is Bluetooth. Sure, audio purists will insist on wires, and on PC I’m one of those, but I can’t deny the convenience of wireless, so I do use it in the evenings. But I’m stuck with AAC with absolutely no way of changing that in Windows 11 itself. There is a tweak that can force-disable the AAC codec in Windows 11 using this registry modification:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\BthA2dp\Parameters]
"BluetoothAacEnable"=dword:00000000
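If you’d rather apply it programmatically than import a .reg file, here’s a minimal Python sketch that writes the exact same value under the same key. Run it from an elevated (Administrator) prompt; I’m assuming a re-pair of the headphones or a reboot is needed afterwards for it to take effect:

import winreg

# Write the same BluetoothAacEnable=0 value as the .reg tweak above,
# which force-disables the AAC codec for Bluetooth A2DP.
# Requires an elevated (Administrator) Python process.
key_path = r"SYSTEM\CurrentControlSet\Services\BthA2dp\Parameters"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "BluetoothAacEnable", 0, winreg.REG_DWORD, 0)

print("BluetoothAacEnable set to 0 (AAC disabled for A2DP)")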

However, in my case Windows just defaults to the SBC codec instead of picking up aptX, making everything sound like hot garbage. And yes, while LDAC isn’t supported by Windows 11, aptX absolutely is, yet you can’t frigging enable it in any way. Why, Microsoft? WHY?

I did find a way to use LDAC with this Alternative A2DP Driver tool:

https://www.bluetoothgoodies.com/a2dp/

Unfortunately, it’s not free, although it’s quite cheap, so if you want to use LDAC, you may want to invest in it, because Microsoft sure doesn’t give two flying fucks about the state of Bluetooth codecs. One would expect a simple, easy way to select codecs in the Windows sound settings, and yet you just can’t do it. Ridiculous. Fix this crap, Microsoft, it’s embarrassing.

The same company that provides the alternative Bluetooth driver also offers the Bluetooth Tweaker tool, which you can use to check which codec Windows is using in your case and to confirm the above alternative driver is working correctly after enabling LDAC.

https://www.bluetoothgoodies.com/tweaker/

This tool is also available for a small fee and, of course, you need headphones that support LDAC. You’ll have to check the product specs to see if yours do. Cheap ones usually only support AAC, while most higher-end ones also support LDAC. Both tools have a 7-day free trial, so you can check things and test them out for a few days before you commit. But I still hope Microsoft will get off their ass and add proper support for codec selection in Windows 11.

EDIT:

Turns out even Linux has Bluetooth in a better state. With the PipeWire audio subsystem you can switch to any supported codec in a matter of 2 clicks. Not only could I easily use aptX on my laptop paired with the SoundPeats TrueAir 2, I could even select SBC XQ on the other laptop where I had AirPods 2 paired. AirPods only support SBC and AAC, no LDAC or aptX, but still, SBC with a higher quality profile (XQ probably stands for Xtreme Quality) sounds way better.

So, again, Microsoft, get your head out of your own ass. First you nuked the hardware audio acceleration that was really amazing before Windows Vista, and now you’re lagging behind even Linux with Bluetooth audio? How pathetic is that?

GeForce RTX 4060Ti is a pointless product and here’s why

So, this won’t be my typical Angry Rant exactly, but I talked with someone in a news comment section about the RTX 4060Ti and realized how pointless this product really is, so I wanted to spill it out here on my blog…

Think about it. NVIDIA released the RTX 4060Ti almost 3 years after the RTX 3060Ti. Performance-wise, it performs basically the same as the RTX 3060Ti, with slightly better power efficiency. Sure, it has the DLSS3 Frame Generation thing, but hardly any game supports that, and you’re buying something based on what it offers NOW, not what it might offer in the future. DLSS upscaling and Frame Generation are something you should look at as “nice to have”, not something I’d base my purchase on. On top of that, Frame Generation won’t appear in old games that are no longer actively supported by devs, so you’re not gaining anything there either. You need raw horsepower to improve those, and the RTX 4060Ti just doesn’t have it.

So, let’s look at who is being targeted with the RTX 4060Ti, what kind of customer they’re aiming at. If things were normal, a new generation would bring a generational performance improvement, the way it always has. That way the RTX 4060Ti would appeal to existing users of the RTX 3060Ti, RTX 2060 Super, GTX 1650 as well as GTX 1060. If the RTX 4060Ti was at least 25% faster than the RTX 3060Ti, more power efficient and also offered Frame Generation tech, I’d actually be like, alright, it’s expensive, but at least it has some substance. It’s faster and also more efficient. If you want to upgrade, why not. But that’s not the case; the RTX 4060Ti performs basically the same as the RTX 3060Ti.

People raving about power efficiency, why? Who gives a shit? Power efficiency matters when you gain performance out of it. What good is 100W lower power consumption when you don’t gain any framerate out of it? Power consumption doesn’t run your games, and if games still run like crap, what’s the point? I don’t get it. This would be a good point if we were talking about a smartphone with finite battery life. This is a desktop card. Performance is all that matters, within a certain power envelope of course, but even the RTX 3060Ti wasn’t some insanely power-hungry card. And if you worry about your power bill that much, then you’re in the wrong hobby. That’s like being a petrolhead and worrying about every single cent you spend on petrol. That’s just a terrible mentality to have. Performance is the top metric you should care about with graphics cards. It’s what gives you a good gaming experience and elevates the value of the game, either through smoothness of play or the eye candy you can turn on. Power efficiency gives you neither.

Ok, so now we’ve established that the RTX 4060Ti is pointless if you already own an RTX 3060Ti. So, let’s look further down the product stack to see if it makes sense there. Say you’re a user with an RTX 2060, or even a GTX 1650 or GTX 1060, in 2023. If you didn’t buy the RTX 3060Ti almost 3 years ago when it was released, why exactly would you buy the same thing 3 years later (now), with just slightly better power efficiency and nothing else? Because that’s what the RTX 4060Ti is. Buying an RTX 4060Ti wouldn’t give you any performance advantage over what you decided to skip 3 years ago. So, why do it now? It just makes no god damn sense. Some people said the RTX 4060Ti would make sense if it was cheaper. But I say, why? How? The RTX 4060Ti doesn’t make ANY sense at ANY price point, because it doesn’t bring any performance increase over a 3 year old product you already decided wasn’t worth buying back then. So, why buy the same thing, just slightly more power efficient, now? You’ll pay the full fat price for it, but your games won’t gain any framerate compared to 3 years ago. Doesn’t that make the RTX 4060Ti the most stupid product release in a really long time? It does.

You may ask why I’m only talking about the RTX 4060Ti all this time and not really mentioning the Radeon RX 7600 much. Frankly, it’s in a similar position. AMD did elevate performance a bit relative to the old gen: the RX 7600 is quite comfortably faster even than the RX 6650XT, which is the refreshed top mid-range card from AMD, and against its actual direct predecessor, the RX 6600, the gap is even larger, so that’s great. But the issue is, they didn’t step up the performance enough to compete with NVIDIA’s offerings from 3 years ago, which makes me wonder, why bother either? That’s the sad reality. It’s cheaper, but then again, so is the RX 6650XT. And so is the RTX 3060Ti. It’s just such a terrible time to make upgrades, and unless you absolutely have to, just don’t. It’s post-COVID stagnation bullshit and I hate it. Not buying the current gen should send a message to both AMD and NVIDIA. I just worry NVIDIA won’t give two shits about it, because their corporate compute side is booming so hard with all that “NVIDIA in the trillion dollar club” stuff and the rise of the whole AI fuckery that I see nothing good for the future of PC gaming. Mark my words.

Wolfenstein 2: The New Colossus crashing on NVIDIA graphics cards

I’ve revisited this game after many years because I realized I somehow never finished it. Well, since I’m clearing my backlog of games, why not finish it up. Until I got repeating crashes the moment I entered the game itself. It never crashed in the menus, and if I didn’t move the mouse in-game it could run just fine, which was weird, but the moment I moved the mouse, it crashed.

I couldn’t really diagnose it because all I got was a sudden freeze of the game with a Windows dialog “ding” sound. After Ctrl+Alt+Del and invoking Task Manager, I could see an error message saying “Could not write crash dump” (this isn’t the actual error, it’s just a symptom of the game being unable to write a crash dump for the error it just experienced), and the Windows error logs only stated an unspecified error in NewColossus_x64vk.exe, and that was it.

wolf_crash

Digging online, I found a bunch of “solutions” that didn’t really work at all, as the game kept crashing no matter what. So I went with ALL settings OFF, as that seemed to work, but it looked rubbish, so I kept gradually re-enabling the settings. Basically I could enable everything until I enabled “Deferred Rendering”, at which point the game instantly crashed again. For some reason this game really doesn’t like the Deferred Rendering method on a GeForce RTX 3080, even with the currently latest 532.03 drivers. Or with these in particular. Not in the mood to downgrade drivers anyway.

SOLUTION

So, make sure “Deferred Rendering” is disabled in the Advanced settings and the game should work.

NVIDIA GeForce RTX 4060Ti is a massive joke and you should avoid it

So, NVIDIA just launched the GeForce RTX 4060Ti graphics card. They officially announced 3 models at the following MSRP prices, which will be gradually released from this point on, starting with the RTX 4060Ti 8GB model:

  • RTX 4060 8GB ($300)
  • RTX 4060Ti 8GB ($400)
  • RTX 4060Ti 16GB ($500)

The way NVIDIA positioned these models, you’re heavily steered towards the 500 bucks model, because the non-Ti model is just pathetic and has a memory problem: a lot of games are showing issues with cards that only have 8GB of VRAM, and that’s at only 1080p! In the year 2023! Imagine how it’ll fare 2, 3 or more years in the future. Remember, buyers of mid-range cards don’t swap them every year or two; they hold onto them for much longer. So, going with the Ti 8GB model solves nothing; you have to go to the Ti 16GB model. For fucking 500 bucks. FIVE HUNDRED for a mid-range card whose real-world performance advantage over a last-generation card released almost 3 years ago is basically within the margin of error. I shit you not, the RTX 4060Ti is just single-digit percent better than the RTX 3060Ti. And percentage numbers always sound big, but when you translate them into framerate it’s often just a 3 or 5 frames per second difference. So, basically margin of error.
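Just to put that “single-digit percent” into perspective with some quick math (the 80 fps baseline here is purely an illustrative number I picked, not a benchmark result):

# Illustrative only: what single-digit percentage gains mean in actual fps.
# The baseline framerate is a made-up example, not a measured benchmark.
baseline_fps = 80
for uplift_percent in (3, 5, 8):
    extra = baseline_fps * uplift_percent / 100
    print(f"+{uplift_percent}% on {baseline_fps} fps = {extra:.1f} extra fps")

A “+8%” bar looks impressive on a marketing slide; 6 extra frames on your screen, not so much.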

And the reason for such a lackluster showing is the fact that NVIDIA cut down the RTX 4060Ti so heavily on all ends that it basically demolished any performance advantage of the newer Ada Lovelace core architecture over the Ampere architecture powering the RTX 3000 series.

First, they gave it zero VRAM upgrade; it’s stuck at 8GB. Like I mentioned, there are games that already saturate that at only 1080p. Then they heavily cut down the memory bus, from the 256-bit found on the RTX 3060Ti to just 128-bit on the RTX 4060Ti. NVIDIA’s excuse was increasing the L2 cache within the GPU core from 4MB to 32MB, but the thing with caches is, they only help when the data actually fits in them. When it does, it works beautifully, but when it doesn’t, you’re basically running at half the memory bus width, and that is a pretty severe downgrade.
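To put actual numbers on that cut, here’s a quick back-of-envelope calculation. The memory speeds are the commonly published spec-sheet figures (14 Gbps GDDR6 on the RTX 3060Ti, 18 Gbps on the RTX 4060Ti), so treat the results as nominal peak values, not measurements:

# Peak memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps.
# Memory speeds below are the commonly published spec-sheet values.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_3060ti = bandwidth_gb_s(256, 14.0)  # ~448 GB/s
rtx_4060ti = bandwidth_gb_s(128, 18.0)  # ~288 GB/s

print(f"RTX 3060Ti: {rtx_3060ti:.0f} GB/s")
print(f"RTX 4060Ti: {rtx_4060ti:.0f} GB/s")
print(f"Raw bandwidth cut: {(1 - rtx_4060ti / rtx_3060ti) * 100:.0f}%")

So every time that 32MB L2 cache misses, the RTX 4060Ti is working with roughly a third less raw memory bandwidth than its predecessor.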

To add insult to injury, NVIDIA cut down the PCIe bus width too. Instead of running at the full x16 width, the PCIe link only runs at x8. That in itself isn’t an issue on a modern motherboard with a PCIe 4.0 slot, but imagine you’re upgrading an older system with only PCIe 3.0 or even 2.0. That means the PCIe link runs way, way slower. And one may say you don’t need that much bandwidth on the slot anyway with a mid-range graphics card, but there’s the problem: that 8GB of VRAM is not enough, and when a game can’t fit its resources in the graphics card’s onboard memory, it spills over the PCIe bus into system memory.
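The same back-of-envelope treatment for the PCIe link, using the nominal per-lane throughput from the PCIe specs (real-world numbers are a bit lower), shows what that x8 limitation costs on an older board:

# Nominal per-lane throughput in GB/s (spec values after encoding overhead).
PER_LANE_GB_S = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

for gen, per_lane in PER_LANE_GB_S.items():
    print(f"{gen} x8:  {per_lane * 8:5.1f} GB/s")

full_x16_gen3 = PER_LANE_GB_S["PCIe 3.0"] * 16
print(f"PCIe 3.0 x16: {full_x16_gen3:5.1f} GB/s (what a full-width card gets on the same older board)")

So on a PCIe 3.0 board, the x8 link halves the slot bandwidth exactly in the scenario where the 8GB card is most likely to be spilling resources into system RAM, and on PCIe 2.0 it’s down to a quarter.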

Death by a thousand cuts

It may all sound insignificant on the surface, but when you lose a percent here and a percent there on a bunch of levels, you suddenly lose a lot of performance in the end. So much, in fact, that all the architectural advantages of Ada Lovelace on the RTX 4060Ti are entirely negated, to the point where it’s basically as fast as the RTX 3060Ti from years ago, because NVIDIA just cut it down so hard. And don’t get me started on Resizable BAR (ReBAR): if your system doesn’t support it, then the too-little memory, cut-down memory bus and cut-down PCIe link make the situation even worse.

Did all this cutting away of features bring users any advantages? No, because the damn thing is as fast as the RTX 3060Ti in the end, yet still costs the same (or more for the 16GB model). You know what it did, though? It increased PROFIT MARGINS for NVIDIA. More VRAM costs money. Memory costs peanuts for end users these days, and for a company buying in bulk with heavy discounts, even less. All of that goes straight into NVIDIA’s pocket when you pay $100 more for a pathetic 8GB extra, and also through the fact that NVIDIA doesn’t give you more as standard. Smaller bus width? The wider the memory interface, the more it costs, because the GPU core needs more connections to the memory, which makes it more expensive to manufacture in the end. Since NVIDIA cut it down, the difference goes into their pockets, because you as the end user sure as hell aren’t benefiting from it. Cut-down PCIe link? Same story. Less complexity, less work, more goes into NVIDIA’s pockets. The cuts weren’t made to make the card cheaper for the end user; the cuts were made to ramp up profits for NVIDIA. Greedy bastards.

Are there even any benefits to be found on the RTX 4060Ti?

The only real benefits of the RTX 4060Ti are slightly better power efficiency, the AV1 encoder and decoder, and DLSS3, which NVIDIA so desperately shills on all ends, showcasing how much better performance it supposedly brings with some bullshit “X times better” graphs, while entirely ignoring the fact that a) DLSS3 isn’t really supported in many games, so they’re selling a performance advantage on the premise of future game updates supporting it, and b) inserting fake frames doesn’t actually make performance better. It makes motion smoother, but adds extra latency and visual artefacting. But they just sell it as MOAR FRAMES PER SECOND, showing graphs with mega differences in framerate as an instant win over the RTX 3000 series, which doesn’t support Frame Generation despite having hardware support for it (the optical flow hardware has been present since the Turing architecture, as far as I’ve checked).

Conclusion

If you already have an RTX 3060Ti card, the RTX 4060Ti is absolutely to be avoided. You won’t be gaining anything unless hardware AV1 support is an absolute requirement for you. Everything else, including DLSS3 Frame Generation, is a worthless benefit. You buy things based on what they offer NOW, not on what they may offer in the future. DLSS performance gains are “nice to have”, because if a card performs well now “as is”, slapping DLSS on top just means it’ll perform even better. If a card performs the same as the last generation from years ago and the only advantage is a “promised” performance gain from fake frames via DLSS3, do you really want that? For full price? Yeah, I think not. Currently only a few games support it, and most existing games will never get support unless they’re still actively developed by dedicated developers who are into this stuff. Most aren’t and never add such stuff to old games.

And even if you’re running something like an RTX 2060, an RTX 2060 Super, an older GTX 16 series card like the GTX 1650, or an equivalent Radeon RX 480/570/580/590, I wouldn’t rush to buy the RTX 4060Ti either. AMD’s offerings are just around the corner with the new RX 7600 cards and price adjustments to older generations like the RX 6650 XT or even the RX 6700 series. Because currently AMD’s offerings are neck and neck; they just have a slight price disadvantage. But you can be sure AMD will adjust that accordingly pretty soon. And AMD also has FSR, which is similar to DLSS. It has similar benefits and issues, and it also falls into the “nice to have” category, but I wouldn’t base my purchase on it either. And lastly, Intel’s Arc cards are pretty aggressively priced, and Intel has made quite a lot of improvements to them.

My advice: don’t rush for the RTX 4060Ti, and wait to see what AMD has to say. Then wait a bit longer for price adjustments, depending on how the RX 7600 performs. You can still buy the RTX 4060Ti afterwards if the numbers aren’t favorable to AMD. I personally think NVIDIA chased profits so hard by cutting things away that they entirely fucked up a card that could otherwise have been amazing if it had the same raw specs as the RTX 3060Ti, but with all the benefits of the Ada Lovelace architecture. The same class of card in every new generation always brought pretty significant performance gains, whereas now it’s literally stagnating at the same level as an almost 3 year old card. That’s just greedy and bad. And this is the class of card that will be sold the most to the masses. All the other, higher models had gains, but not the 60 series. Pathetic, NVIDIA, really pathetic.