 

Type casting or alternative

Started by Yegg, April 19, 2009, 01:27 AM


Yegg

Of the two bits of code below, is one "better" than the other (if so, explain in what way), or does it just boil down to preference? Assume n is equal to 128845834980000000.

int x = (unsigned int *) n;

int x = n % 4294967296;

First off, I understand that the cast in the first piece of code isn't required, though it keeps the compiler from warning about a possible overflow.
Both pieces of code will produce the same value. Is one better than the other? Why? Or is it just preference?
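
For reference, here's roughly the setup I have in mind, as a minimal sketch. I'm assuming n is a 64-bit type (unsigned long long here); the variable names are just for illustration.

#include <stdio.h>

int main(void)
{
    unsigned long long n = 128845834980000000ULL;

    /* option 1: truncate by converting to a 32-bit unsigned type */
    unsigned int a = (unsigned int) n;

    /* option 2: truncate by taking the value modulo 2^32 */
    unsigned int b = (unsigned int)(n % 4294967296ULL);

    /* both keep only the low 32 bits of n, so they should print the same value */
    printf("%u %u\n", a, b);
    return 0;
}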

brew

Well, the first bit of code would error on compile. You're trying to assign an int the value of an unsigned int *, which doesn't work out too well.
The second, assuming n is an __int64, would work.

Yegg

Quote from: brew on April 19, 2009, 09:56 AM
Well, the first bit of code would error on compile. You're trying to assign an int the value of an unsigned int *, which doesn't work out too well.
The second, assuming n is an __int64, would work.

I ended up also asking this on another forum. I made a few errors when I asked this question, the main one being that I never define n in my code; n was only used here to hopefully make the question easier to read, but that didn't turn out to be the case. In the actual code you would see int x = (unsigned int)128845834980000000. The asterisk wasn't supposed to be there, sorry.

As for the code not compiling: it compiles both with and without the asterisk in Apple gcc and in Microsoft Visual C++ Express 2008, and I get the desired/expected results.

The other error I made was that I meant it to be unsigned int x, not just int x. Since n was never actually defined and 128845834980000000 appears directly in the initializer when declaring x, I figure the compiler was able to convert it to the right type at compile time.
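
So the corrected line, as a rough sketch, would look like this (the ULL suffix is my addition, just to make it explicit that the constant is a 64-bit literal):

/* truncate the 64-bit constant to its low 32 bits; the unsigned
 * conversion wraps the value modulo 2^32, just like the % version */
unsigned int x = (unsigned int)128845834980000000ULL;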

brew

Quote from: Yegg on April 19, 2009, 11:30 AM
As for the code not compiling: it compiles both with and without the asterisk in Apple gcc and in Microsoft Visual C++ Express 2008, and I get the desired/expected results.
I don't use either Apple gcc or VC9, but I don't understand how it would compile: you're explicitly casting an unsigned int into an unsigned int *, then trying to assign that value to an int. I would think it'd be a type error. Weird.

Yegg

Quote from: brew on April 19, 2009, 01:18 PM
Quote from: Yegg on April 19, 2009, 11:30 AM
As for the code not compiling: it compiles both with and without the asterisk in Apple gcc and in Microsoft Visual C++ Express 2008, and I get the desired/expected results.
I don't use either Apple gcc or VC9, but I don't understand how it would compile: you're explicitly casting an unsigned int into an unsigned int *, then trying to assign that value to an int. I would think it'd be a type error. Weird.
A few people on the Devshed forums (forums.devshed.com) also said it would not compile, including one person who answers most of the C questions on their whole C programming forum. He also mentioned that I should build with -Wall -Werror in Apple gcc, which turns on the common warnings and treats them as errors.
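
Something like this is what I plan to test with those flags; it's just a sketch, and the file name is made up:

/* trunc.c -- file name is only for illustration */
int main(void)
{
    unsigned long long n = 128845834980000000ULL;
    int x = (unsigned int *) n;   /* the accidental pointer cast from my first post */
    (void) x;                     /* silence any unused-variable warning */
    return 0;
}

/* Build with:  gcc -Wall -Werror trunc.c
 * With -Werror, the pointer/integer conversion warnings on the cast line
 * should become hard errors instead of just scrolling past. */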
What compiler are you using? Regular gcc? Something else?

brew

Quote from: Yegg on April 19, 2009, 02:19 PM
What compiler are you using? Regular gcc? Something else?
I use MSVC6 and regular gcc. Perhaps the compiler was smart enough to see that the cast was completely unnecessary since the two variables are of the same type, and removed it before it could error.
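
Or maybe it's simpler than that. Here's the statement stripped down, with my reading of what the compiler is allowed to do (take the comments with a grain of salt, it's just how I understand the standard):

unsigned long long n = 128845834980000000ULL;

/* Step 1: the cast turns the 64-bit value into a pointer value.
 * That part is legal C, though the resulting pointer is
 * implementation-defined. */
/* Step 2: initializing an int from a pointer isn't allowed by the
 * standard without a cast, so the compiler has to complain about it,
 * but it can still go ahead and accept the program anyway. */
int x = (unsigned int *) n;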

Yegg

Quote from: brew on April 19, 2009, 05:50 PM
Quote from: Yegg on April 19, 2009, 02:19 PM
What compiler are you using? Regular gcc? Something else?
I use MSVC6 and regular gcc. Perhaps the compiler was smart enough to see that the cast was completely unnecessary since the two variables are of the same type, and removed it before it could error.

That's what I kind of figured. I'll be changing some gcc settings to turn on more warnings, though, so that I can stay even farther away from bad code.