DroidForums.net | Android Forum & News


What makes the Galaxy Nexus so special

My list was simply a list of reasons why someone would pick the Galaxy Nexus over competing phones. I wasn't bashing the competition (well, except maybe the iPhone a wee bit, hehe); I was giving a "pros" list for the G-Nex.

The Rezound does have a fantastic screen, but I don't believe it's any better than the Nexus's, nor necessarily any worse. So even if the two screens are identical in quality, my statement stands: the Nexus has the best screen, and so does the Rezound. The Nexus does have the size advantage, though: 4.65" versus the Rezound's 4.3". Not a HUGE difference, but it's enough...

The Nexus DOES have a superior chunk of hardware; I am confident in that. I'm not new to tech marketing and product development, and it's no different from the AMD Phenom II quad-core processors that could be unlocked into six-core processors: companies will "soft"-disable bits of hardware, such as turning off good cores, under-clocking, or (like Intel) eliminating a cache level. It usually starts with "we have these two thousand six-core processors, but people are buying WAY MORE quad-cores... so we could sit on them, or we could disable two cores, sell them as quad-cores, and make money now without tainting our brand!" Sometimes there are actual physical defects on the chips, but in the AMD case, an 80%+ unlock success rate tells you chip defects weren't the primary reason. The Radeon 6850/6950 being unlockable into a Radeon 6870/6970 is the same exact thing.

Thus it is with the Nexus. TI rates the OMAP4460 for up to 1.5GHz (the same headline clock the Rezound's Qualcomm chip runs at), and I would bet $1,000,000 Internet Dollars that the Nexus phones all got chips that surpassed the "necessary minimum" testing by a comfortable margin. Yet they still chose to under-clock the chip from 1.5GHz to 1.2GHz, because frankly, a 1.2GHz dual-core is still a hefty CPU for a PHONE!
At some point the chips will be unlocked via a software update to run at the full 1.5GHz, to keep them "up to date," and as I said before, I have NO DOUBT the Galaxy Nexus will be the first phone able to run stable above 2GHz, because it has the "most perfect" silicon in it. It's just how it is.
Flagship phone gets the flagship CPU, with the fewest (or no) defects that would hamper its potential.
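The binning logic described above can be sketched as a toy model (every number and tier name here is hypothetical, not TI's actual data):

```python
# Toy model of chip binning as described above: every die is tested for
# its maximum stable clock, then sold into whichever tier it qualifies
# for. All thresholds and tier names are hypothetical, not TI's data.

def bin_chip(max_stable_ghz):
    """Assign a tested die to a (made-up) product tier."""
    if max_stable_ghz >= 1.5:
        return "flagship: ship at 1.2GHz with lots of headroom"
    if max_stable_ghz >= 1.2:
        return "mainstream: ship at 1.2GHz with little headroom"
    return "reject or down-bin"

for ghz in (1.9, 1.6, 1.3, 1.1):
    print(f"{ghz}GHz die -> {bin_chip(ghz)}")
```

The point being: two phones can carry "the same" part number while one gets the dice with far more overclocking headroom.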

Regarding 4G download speed, I can't speak to the West Coast since I'm in the Midwest, but even in a city of over 2 million people (we've had LTE infrastructure since the testing phases, so we're probably further ahead than most), my friend's Thunderbolt averages around 15Mbps downstream. Another friend's 4G card for his MacBook Pro gives him around 25Mbps consistently. And at one of the city's VZW stores three days ago, an employee told me he'd seen the Nexus peak at 30Mbps downstream while a manager had one out and was playing with it.


If you watch movies/TV and play games, this phone will have the screen to make them look life-like, and the hardware to keep it stutter-free while the 4G speed keeps the video from getting choppy, even at 720p while in a car constantly switching towers.

PLUS, YOU HAVE GOOGLE BEHIND YOU!

Right now they are having trouble just getting the Nexus stable at 1.4GHz over on XDA... I'm not saying it won't make it to 1.5GHz (which they cannot make stable right now), but I don't see it making it to 2.0GHz. I foresee the Rezound making it to 2.0GHz, but not the G-Nex.
 

The reason it's not stable has NOTHING AT ALL to do with the chip. It has to do with the software clocking the chip at 2.4GHz and then dividing by two. So when you try to set 1.5GHz, the phone thinks you're trying to clock it at 3GHz and says "hell no."

The chip can do it; they are just fixing TI's software, since right now the only way to get 1.5GHz stable is to never let it clock below 1GHz.
 
Since I didn't feel like typing this again, here is what I said in the pre-release thread. For the record, the chip can hit 1.8GHz+ and will hit 2.0GHz in no time, but the software is complicating matters.

"So this is kind of important: because the OMAP4460 has to switch between two ways of clocking around the 1GHz mark, it seems we can't get OC kernels that drop below 1GHz when full speed isn't needed.

The MPU DPLL is supposed to run at 2.4GHz, with the clock generator dividing by two for a 1.2GHz CPU clock, and it seems most chips won't go above roughly a 2.8GHz DPLL threshold. That means higher speeds need the DCC (which allows clocking without the divide-by-two, e.g. 1.2GHz = 1.2GHz). But as soon as the chip drops below the designated 1GHz, the MPU DPLL can't clock that low, so it disables the DCC, and the OMAP chip seems to crash whenever the DCC is disabled.

So it seems that to overclock, you can't ever go below 1GHz if you want to keep the DCC enabled.

More info: [DEVS/KERNEL] Galaxy Nexus 1.4GHz overclock + Undervolting patches - xda-developers"
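The constraint in the quote can be sketched like this. The 2.8GHz DPLL ceiling and 1GHz DCC floor are the numbers from the post; the decision logic is my guess at the behavior, not TI's actual driver code:

```python
# Sketch of the OMAP4460 clocking constraint described in the quote.
# Numbers come from the post; the branching is an illustration only.

DPLL_MAX_GHZ = 2.8  # approx. max MPU DPLL speed most chips reach
DCC_MIN_GHZ = 1.0   # below this, the 1:1 DCC path reportedly can't run

def pick_clock_path(target_ghz, overclocked=False):
    """Return which clock path serves a requested CPU speed.

    overclocked=True models an OC kernel that keeps the 1:1 DCC enabled.
    """
    if not overclocked:
        if target_ghz * 2 <= DPLL_MAX_GHZ:
            return "DPLL/2"     # DPLL runs at 2x target, divided by two
        return "needs DCC"      # 1.5GHz would need a 3.0GHz DPLL > 2.8 cap
    # With DCC on, 1:1 clocking works -- but dropping below 1GHz means
    # disabling DCC, which the post says crashes the chip.
    return "OK (1:1)" if target_ghz >= DCC_MIN_GHZ else "crash (DCC off)"

print(pick_clock_path(1.2))                    # stock 1.2GHz: divider path
print(pick_clock_path(1.5))                    # why 1.5GHz needs the DCC
print(pick_clock_path(0.8, overclocked=True))  # why OC kernels stay >= 1GHz
```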

Wow, that kinda sux actually... Hope they figure it out...

My Rezound Rocks the Red n Black... Get over it... Now to get this thing rooted
 

For the time being, the simple fix is to stay above 1GHz at all times; anything below 1GHz can't be reached without the clock/2 DCC. If you're OK with some wasted battery, it's fine.

What we could always try is to port the clocking software from another chip, such as the Exynos, and work on optimizing it for the OMAP. But that's a long shot.
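For what it's worth, a "never drop below 1GHz" floor would normally be applied through the standard Linux cpufreq sysfs interface on a rooted phone. A minimal sketch, assuming the OC kernel actually honors the floor:

```python
# Sketch of the workaround: pin the CPU's minimum clock at 1GHz via the
# standard Linux cpufreq sysfs knob. Requires root, and assumes the
# kernel honors scaling_min_freq; the path is the stock cpufreq location.

CPUFREQ_MIN = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_min_freq"

def min_freq_command(floor_ghz):
    """Build the root shell command that sets the minimum clock (in kHz)."""
    khz = int(floor_ghz * 1_000_000)
    return f"echo {khz} > {CPUFREQ_MIN}"

print(min_freq_command(1.0))
```

The battery cost comes from the governor never being allowed to idle the CPU down to its lower stock states.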
 

That is definitely a long shot... Why not optimize an old set like say double the instructions from a 3360 instead?

My Rezound Rocks the Red n Black... Get over it... Now to get this thing rooted
 

Because, unfortunately, only the TI chips that were intended to go above the 1GHz clock use the same clocking method (the OG Droid can hit 1GHz easily, I should add), so even if that approach were used, clocking much above 1.5GHz probably isn't feasible. Still, it's worth a shot. The reason the Exynos could be worth trying is simply that those chips can, and have, reached above the 3.0GHz mark the OMAP can't get to. In all honesty, though, I have no idea how that would be done, or even whether it's possible.
 
Don't forget that the Snapdragon MSM8660 only supports single-channel memory, while the TI OMAP4460 has a dual-channel memory controller, theoretically doubling the phone's effective memory bandwidth... I have glanced over the architecture of both chips, and while I have a basic understanding of how the two work, I'm not clear on how they handle RAM. Is it more like Intel's Front-Side Bus or more like AMD's HyperTransport? Either way, a slight boost to memory speed could end up being just as effective as a bigger boost to CPU clock speed...

I know that I notice more of a difference from bumping my Phenom II X6 1100T's HT Link up by ~10-20MHz than from jumping the CPU clock from 4.7GHz to 4.9GHz...
Does the same theory apply here?
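To put rough numbers on the dual-channel argument (the LPDDR2-800 data rate and 32-bit channel width are my assumptions for typical phones of this era, not published specs for these exact devices):

```python
# Rough peak-bandwidth math for the phones' memory subsystems. The
# LPDDR2-800 data rate and 32-bit channel width are assumptions for
# typical hardware of this era, not confirmed specs for these devices.

def peak_bandwidth_gbs(mt_per_s, bus_bits, channels):
    """Theoretical peak memory bandwidth in GB/s."""
    return mt_per_s * (bus_bits / 8) * channels / 1000

single = peak_bandwidth_gbs(800, 32, 1)  # single-channel controller
dual = peak_bandwidth_gbs(800, 32, 2)    # dual-channel controller
print(f"single: {single} GB/s, dual: {dual} GB/s")  # second channel doubles it
```

Whether real workloads see that doubling depends on the memory controller and access patterns, but the headroom is there.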
 
Also, I've been looking around at benchmarks for the stock phones; here are the specs:

DROID RAZR
- CPU: 1.2GHz dual-core TI OMAP4430
- GPU: PowerVR SGX540 @ 304MHz
- RAM: 1GB LPDDR2, single-channel

HTC REZOUND
- CPU: 1.5GHz dual-core Qualcomm MSM8660
- GPU: Adreno 220
- RAM: 1GB LPDDR2, single-channel

GALAXY NEXUS
- CPU: 1.2GHz dual-core TI OMAP4460
- GPU: PowerVR SGX540 @ 384MHz
- RAM: 1GB LPDDR2, dual-channel

The Motorola RAZR beats the Rezound in almost every benchmark they've been put up against each other in, although how much of that comes down to the Sense/Blur UIs is unclear. Scores below are RAZR vs. REZOUND (from Engadget):
- QUADRANT: 2,798 vs. 2,347
- LINPACK SINGLE (MFLOPS): 50 vs. 52
- LINPACK MULTI (MFLOPS): 95.6 vs. 60.3

Clearly, the RAZR has the edge here. I use it for comparison because it is the most similar to the Nexus in hardware. The Nexus, unlike the RAZR, will have the following advantages:
- No OEM UI (Sense/Blur)
- OMAP4460 vs. OMAP4430 (a higher-quality chip with more overhead and dual-channel support)
- 1GB dual-channel memory (the RAZR and Rezound are single-channel only)
- PowerVR SGX540 @ 384MHz (the RAZR's runs at only 304MHz), which, going by the GPU's scaling, should be good for at least another 1.2-1.5 GFLOP/s

I wouldn't be surprised to see the Galaxy Nexus pull scores 30-50% higher than the RAZR's due to these differences.
While I'm not an expert on smartphone silicon, I am pretty knowledgeable about PC silicon, and the difference between single- and dual-channel RAM, a ~30% increase in GPU speed/throughput, and a stepped-up CPU adds up to quite a boost in benchmark scores.
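Working the quoted scores through (plain arithmetic on the numbers above, no new data):

```python
# Percentage gaps from the quoted benchmark scores.
razr_multi, rezound_multi = 95.6, 60.3   # Linpack multi-thread MFLOPS
gap = (razr_multi - rezound_multi) / rezound_multi * 100
print(f"RAZR leads Linpack multi by {gap:.0f}%")

# What a 30-50% lead over the RAZR's Quadrant score would look like.
razr_quadrant = 2798
projected = [round(razr_quadrant * (1 + p)) for p in (0.30, 0.50)]
print(f"projected Nexus Quadrant range: {projected[0]}-{projected[1]}")
```

So the RAZR's multi-threaded Linpack lead over the Rezound is close to 59%, and a 30-50% bump over the RAZR would put the Nexus in the roughly 3,600-4,200 Quadrant range.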
 
BTW, just to test my theory, I tried something with my gaming rig:
I've got a heavily-OC'd, liquid-cooled AMD/ATI setup (Asus Crosshair V, Phenom II X6 1100T @ 4.7GHz, 32GB DDR3-1600 RAM, 2x Radeon HD 6970 OC'd, 2x 240GB SSDs, 2x 120GB SSDs, 1x 240GB PCIe x4 SSD, RAID10 2TB 7200rpm HDDs, external-enclosure water cooling with 3 main 360mm radiators with 3 push/3 pull fans each, a 2L refrigerated reservoir, 3 main pumps, 3 inline pumps, 3 inline 120mm radiators, 1/2" tubing, pure-copper heatsinks, etc. - resulting in 3 separate H2O loops and temps that STAY at ambient unless pushed hard).

Just to see what difference the RAM makes on its own, I took out 3 of the 4 8GB sticks and ran PCMark. Then I took out the last 8GB stick, put in 2x 4GB DDR3-1600 sticks (both the 4GB and 8GB sticks are G.Skill Ripjaws X series; timings were set identically), and ran the exact same PCMark test, with the ONLY difference being dual-channel versus single-channel.

A 423-point difference overall, in favor of dual-channel. There are likely more variables at play, seeing as I have a processor running 1.5GHz higher than stock, both GPUs running almost 200MHz faster than stock (with shader and memory speeds up by ~400MHz and ~1GHz respectively), and the NB/SB on the mobo liquid-cooled and overclocked, as is the HyperTransport link.

Clearly, dual-channel versus single-channel has a noticeable performance effect, particularly combined with a multi-core processor, assuming the memory controller is optimized to take advantage of it: dual-channel roughly doubles the theoretical memory bandwidth.
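For the rig above, the theoretical peak-bandwidth gap is straightforward (DDR3-1600 on the standard 64-bit channel width):

```python
# Peak-bandwidth math for the desktop experiment: DDR3-1600 on standard
# 64-bit channels. This is theoretical peak, not measured throughput.

def ddr3_peak_gbs(mt_per_s, channels):
    """Theoretical peak bandwidth in GB/s for 64-bit DDR3 channels."""
    return mt_per_s * 8 * channels / 1000  # 8 bytes per transfer per channel

print(ddr3_peak_gbs(1600, 1))  # one 8GB stick: single-channel
print(ddr3_peak_gbs(1600, 2))  # 2x 4GB sticks: dual-channel
```

That is 12.8 GB/s versus 25.6 GB/s on paper; how much of the doubling shows up in a composite score like PCMark depends on how memory-bound each subtest is.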
 

^^^ I like what this dude's saying

Sent from my DROID2 using Tapatalk
 

Nerdgasm! Man I bet that PC smokes!!
 