Mac mini with M2 and M2 Pro
Page 2 of 2

Matsu
Veteran Member
 
Join Date: May 2004
 
2023-01-25, 09:43

I was surprised to recently find somewhat affordable 8K TVs in the mass market; I guess I haven't been paying attention. I wonder why computer displays are so far behind?

PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2023-01-25, 10:36

Apple's GPUs would struggle at 8K; even the higher-end units in the Mac Studio are lower mid-range in terms of performance. I wouldn't push it for anything more than documents and video playback.

Computer monitors are often behind TVs in terms of cutting-edge resolution; that's really nothing new. 4K TVs were out before 4K computer monitors as well. TVs have a lot less work to do to push more pixels, since they are just doing video playback. TVs skipped 2K (1440p), likely due to a complete lack of native content, something computers didn't have to worry about.
Matsu
Veteran Member
 
Join Date: May 2004
 
2023-01-25, 11:44

My interest in 8K on a computer is primarily for viewing still images.

But I also like viewing movies: a lot of "classics" from the '60s through the '90s, and the newest independent and studio fare. I haven't been to a theatre in 3 years, but it's likely that most theatres don't have more than DCI 4K, and may not be showing more than DCI 2K a lot of the time. At home we're watching consumer 4K, and I'm impressed by how good/natural 2K and 4K transfers* of an old movie can look. You're seeing film grain/noise texture in the 16mm/35mm source material. Not sure what 8K gains here, if anything, unless other processing trickery is applied. Now 65/70mm is probably a different story, but we will also need a bigger screen to recreate the experience...

(*is it technically a master if the source was film? I guess that's why they celebrate "remasters" from original reels/prints)

I have the time to make a basement family/cinema room. It's going to be about an 8-10 ft viewing distance, with a big couch seating four or five people across. I think the screen should be about 75-80" to give it the right impact. Not sure if 8K will matter; there's no source material, really... Maybe just a high-quality 4K with really good black levels.
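As a rough sanity check on whether 8K would even be visible at that distance, here's a back-of-the-envelope Python sketch (assuming a 77" 16:9 panel at about 9 ft and the ~60 pixels-per-degree figure often quoted as the limit of 20/20 vision; both numbers are just my assumptions):

Code:
import math

def pixels_per_degree(diag_in, distance_in, horiz_px, aspect=(16, 9)):
    """Approximate angular resolution for a flat screen viewed on-axis."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)          # screen width from diagonal
    view_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horiz_px / view_deg

# 77" screen at ~9 ft (108"), roughly the room described above
for label, px in [("4K", 3840), ("8K", 7680)]:
    print(label, round(pixels_per_degree(77, 108, px)), "px/deg")
# ~111 px/deg for 4K and ~222 px/deg for 8K: both already well past ~60 px/deg,
# so a good 4K panel with strong black levels is probably the smarter buy here.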

PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2023-01-25, 15:57

Whether or not 8K is worth it will depend on the content and how large a screen you get. With a TV over 100", most likely yes, if you are close enough, as in less than 15 ft away. If you are going to use a projector and screen, you will pay big time for that projector; I don't know if they even go past 4K.
Matsu
Veteran Member
 
Join Date: May 2004
 
2023-01-25, 16:40

I don't think many theatres are showing more than DCI 4K, and many may still be showing DCI 2K (upscaled)... People are often surprised by this, considering the screen is so big, but they're usually sitting a lot further back. A big difference is that the DCP file theatres use is 4:4:4 with a significantly higher bit rate, so the motion and color resolution compare very well against streaming and home media sources of the same nominal resolution. And there are other factors too, like more precise control of lighting, among other things. They're not just plopping a 4K Blu-ray in there and showing it.
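To put rough numbers on that bit-rate gap, here's a quick Python sketch using commonly quoted peak rates (these figures are my assumptions and vary a lot by title and service, but the order of magnitude is the point):

Code:
# Bits spent per pixel per frame at roughly typical peak bit rates (assumed figures)
sources = {
    "DCI 4K DCP (~250 Mbit/s max)": (250e6, 4096, 2160, 24),
    "UHD Blu-ray (~80 Mbit/s)":     (80e6,  3840, 2160, 24),
    "4K streaming (~16 Mbit/s)":    (16e6,  3840, 2160, 24),
}
for name, (rate, w, h, fps) in sources.items():
    print(f"{name}: {rate / (w * h * fps):.2f} bits per pixel per frame")
# The DCP gets roughly an order of magnitude more bits per pixel than a streaming
# source of the same nominal resolution, which is a big part of why it holds up.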

That said, there's some very good streaming and BD playback available now. If people take some care to set up their viewing environment, they can get a very good experience for motion-picture viewing. I imagine that most will not perceive a real benefit from 8K and should probably prioritize color, dynamic range, black levels, viewing angles, and how well the set works in the ambient lighting of their viewing room (plus marks for working well in a dim room, which is how films are supposed to be viewed...). All of that should probably be weighed ahead of the difference between 4K and 8K...

Last edited by Matsu : 2023-01-25 at 19:32.
Marcellus Wallace III
New Member
 
Join Date: Jan 2023
Location: The Netherlands
 
2023-01-25, 17:22

I believe I am totally 8k no matter the viewing angle.
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2023-01-25, 18:55

Quote:
Originally Posted by PB PM
TVs skipped 2k (1440p),
1080p / Full HD is 2K. The K refers to the rounded horizontal pixel count. Hence 3840x2160 = 4K if you squint and round up a lot.

1440p is 2.5K (2560).
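To make the rounding explicit, a quick sketch (Python, just for the arithmetic):

Code:
# "K" is just the horizontal pixel count divided by 1000, rounded for marketing
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "UHD 4K": (3840, 2160), "UHD 8K": (7680, 4320)}.items():
    print(f"{name}: {w}x{h} -> {w / 1000:.2f}K")
# 1080p -> 1.92K ("2K"), 1440p -> 2.56K ("2.5K"), UHD -> 3.84K ("4K"), 8K -> 7.68K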
Matsu
Veteran Member
 
Join Date: May 2004
 
2023-01-26, 07:47

I'm not so sure TVs have routinely been ahead of computer displays. They progressed from regional standard-def analog systems with a maximum broadcast quality of 720x480 or 720x576 interlaced, and in practice often much less; VHS quality could be half or a quarter of that, 350x240-ish... TVs weren't showing even 480p until digital/DVD. Small niches for analog HD TV in Japan and Laserdisc nerds could do better, but virtually no one had that. The first five-figure plasmas were 480p widescreen affairs (850x480), and the first gen of HDTV was only 1280x720. 1080p sets only really hit in the late 2000s... I'd had at least a 1024x768 or better display since the mid-'90s, and 2560x1440/2560x1600 for over a decade. My computer displays didn't fall behind TV tech until 4K became popular...

PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2023-01-26, 11:04

Quote:
Originally Posted by chucker
1080p / Full HD is 2K. The K refers to the rounded horizontal pixel count. Hence 3840x2160 = 4K if you squint and round up a lot.

1440p is 2.5K (2560).
Right, not sure how I forgot that.
PB PM
Sneaky Punk
 
Join Date: Oct 2005
Location: Vancouver, BC
2023-01-26, 11:09

Quote:
Originally Posted by Matsu
I'm not so sure TVs have routinely been ahead of computer displays. They progressed from regional standard-def analog systems with a maximum broadcast quality of 720x480 or 720x576 interlaced, and in practice often much less; VHS quality could be half or a quarter of that, 350x240-ish... TVs weren't showing even 480p until digital/DVD. Small niches for analog HD TV in Japan and Laserdisc nerds could do better, but virtually no one had that. The first five-figure plasmas were 480p widescreen affairs (850x480), and the first gen of HDTV was only 1280x720. 1080p sets only really hit in the late 2000s... I'd had at least a 1024x768 or better display since the mid-'90s, and 2560x1440/2560x1600 for over a decade. My computer displays didn't fall behind TV tech until 4K became popular...
Maybe my memory is playing tricks on me? I guess there were a few high-end 2.5K monitors like the ACD that came out in the early 2000s. Dates for that stuff are a little fuzzy in my mind; so much was coming out so fast in the 2000-2010 time frame.
dglow
Member
 
Join Date: May 2004
 
2023-01-27, 13:29

Quote:
Originally Posted by chucker
I think it's simply that Apple increased the bandwidth. Their HDMI port works by converting the signal (from DisplayPort, I think); Apple's M chips have no native HDMI signal.
Apple does support HDMI natively. It needs to – HDMI uses a completely different signaling method from DisplayPort, one derived from the older DVI standard.

The most significant difference with DisplayPort (other than avoiding HDMI’s royalties) is that it is a packetized protocol, which is what allows it to be tunneled over Thunderbolt / USB4.

Quote:
Originally Posted by PB PM
Maybe my memory is playing tricks on me? I guess there were a few high-end 2.5K monitors like the ACD that came out in the early 2000s. Dates for that stuff are a little fuzzy in my mind; so much was coming out so fast in the 2000-2010 time frame.
Nah, you're spot-on: the 30" Cinema Display, 2560x1600, debuted in 2004. We must thank Apple for sticking with 16:10 displays; anyone who has used a 1440p monitor knows it is too short for anything but games or video.

The 30" ACD was a minor wonder of the world at the time. That much real estate cost a cool $3300. At ~$5150 in today's money, it puts the Pro XDR's pricing in some context.

For the record, in today’s money the 30” ACD’s stand cost an additional $0.00.

Last edited by dglow : 2023-01-28 at 12:12.
Matsu
Veteran Member
 
Join Date: May 2004
 
2023-01-30, 05:05

8K TVs use HDMI 2.1. It's interesting because the TVs' HDMI implementations/modes allow for variation in bit depth and chroma subsampling depending on the source and output. You could pass anything from 8-bit 4:2:0 up to 12-bit 4:4:4, depending on resolution and framerate. As alluded to above, manufacturers have not always been clear about exactly what their HDMI implementation supports, and it could be the case that the 8K input on some sets/sources doesn't preserve all the data from one device to the other, but the TV tries to hide/recreate it through signal processing.
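For a rough sense of why those modes matter at 8K, here's a back-of-the-envelope Python sketch (active pixels only, ignoring blanking and FEC; the ~42.6 Gbit/s figure is the effective payload commonly quoted for HDMI 2.1's 48G FRL link, so treat the exact numbers as approximations):

Code:
CHROMA_SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # components per pixel

def video_gbps(w, h, fps, bit_depth, chroma):
    # Uncompressed payload for the active picture only, in Gbit/s
    return w * h * fps * bit_depth * CHROMA_SAMPLES[chroma] / 1e9

HDMI21_EFFECTIVE = 42.6  # approx. usable Gbit/s on a 48G FRL link

for depth, chroma in [(8, "4:2:0"), (10, "4:2:0"), (10, "4:4:4"), (12, "4:4:4")]:
    need = video_gbps(7680, 4320, 60, depth, chroma)
    verdict = "fits" if need <= HDMI21_EFFECTIVE else "needs DSC or subsampling"
    print(f"8K60 {depth}-bit {chroma}: ~{need:.0f} Gbit/s -> {verdict}")
# 8-bit and 10-bit 4:2:0 (~24-30G) fit; 10/12-bit 4:4:4 (~60-72G) only work with
# Display Stream Compression, so something has to give on some source/sink combos.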


Last edited by Matsu : 2023-01-30 at 06:35.
chucker
 
Join Date: May 2004
Location: near Bremen, Germany
2023-01-30, 05:30

Quote:
Originally Posted by dglow
Apple does support HDMI natively. It needs to – HDMI uses a completely different signaling method from DisplayPort, one derived from the older DVI standard.
Like I said, Apple uses a conversion chip to go from DP signaling to HDMI signaling.
dglow
Member
 
Join Date: May 2004
 
2023-01-31, 02:17

Quote:
Originally Posted by chucker
Like I said, Apple uses a conversion chip to go from DP signaling to HDMI signaling.
Ah, I missed the part where you said ‘chip’. But there’s no reason the source must be a DP signal; it’s just as likely a data bus feeding a buffer on the conversion chip.