Originally Posted by Coq de Combat
Well, you heard my head explode as well, despite me not being a C programmer. The syntax of C++, C# and Java is based on C, though. But I do get the idea you're referring to. It's not intuitive when "3" != 3. On the other hand, strongly typed languages have their pros, such as giving you better (mental) control over your data. In my opinion, at least, since I don't really want "3" to be 3 -- they're values of very different types, with different methods and so on.
Oh, I didn't want you to answer homework questions. I was merely thinking about the principle of putting JSON data into an array as JSON objects. I mean, it works and I've got the homework finished and working. The thing is that I feel wrong about doing so even though it works.
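For anyone curious what that pattern looks like in practice, here's a minimal JavaScript sketch of parsing JSON text and ending up with an array of objects (the data and names are made up for illustration, not from the actual homework):

```javascript
// A JSON string holding several objects, as it might arrive from a server.
const payload = '[{"id": 1, "name": "ale"}, {"id": 2, "name": "stout"}]';

// JSON.parse turns the text into real JavaScript values --
// here, an array of plain objects you can index and iterate.
const beers = JSON.parse(payload);

console.log(Array.isArray(beers)); // true
console.log(beers[0].name);        // "ale"
```

Nothing wrong with it, really: an array of parsed objects is the normal way to hold a JSON collection in memory.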
Hah. It sounds wrong to just say "3" equals 3...
For those besides the programmer-types here who're having difficulty following along: We're discussing the various ways that a program or script retains a value.
There are myriad types -- that is, kinds of value. The two simplest examples are the string and the number. "3" is a string -- the textual expression of the number three -- while 3 is a number, the numeric value itself.
// Two equals signs mean "compare the value without considering type"
// Three equals signs mean "compare the value and the type"
// Things can get weird when type isn't accounted for...
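In JavaScript, for instance, those two kinds of comparison look like this (a minimal sketch of the behaviour those comments describe):

```javascript
// Two equals signs: loose comparison -- the string "3" is
// coerced to the number 3 before the values are compared.
console.log("3" == 3);  // true

// Three equals signs: strict comparison -- a string is never
// equal to a number, so no coercion happens.
console.log("3" === 3); // false
```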
Anyway. Usually, when you're comparing two things, you want to be as strict as possible in the comparison -- you usually don't want "0" to equal FALSE.
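And yet, with loose comparison, it does -- which is exactly the kind of weirdness I mean. A quick JavaScript illustration:

```javascript
// Loose comparison coerces both sides to numbers,
// so the string "0" ends up equal to false...
console.log("0" == false);  // true

// ...even though "0" is a non-empty string, and non-empty
// strings are truthy in a boolean context.
console.log(Boolean("0"));  // true

// Strict comparison sidesteps the surprise entirely.
console.log("0" === false); // false
```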
And, CdC -- sorry, I didn't mean to sound like I was accusing you of wanting to cheat. It's mostly my problem of not knowing when to stop thinking aloud. As you can see.