Well, you heard my head explode as well, despite me not being a C programmer. The syntax of C++, C#, and Java is based on C, though. But I do get the idea you're referring to. It's not intuitive that "3" != 3. On the other hand, strongly typed languages have their advantages, such as keeping better (mental) control of your data. In my opinion, at least, since I don't really want "3" to be 3; they're very different values with different methods and so on.
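A quick Javascript illustration of what I mean by "3" and 3 being very different values (just a sketch I typed up, not from the homework):

```javascript
// The same "+" does very different things depending on type
console.log(3 + 1);   // 4 -- numeric addition
console.log("3" + 1); // "31" -- string concatenation!

// The values also carry different types and methods
console.log(typeof 3);   // "number"
console.log(typeof "3"); // "string"
console.log("3".length); // 1 -- strings have methods numbers don't
```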
Oh, I didn't want you to answer homework questions. I was merely thinking about the principle of putting JSON data into an array as JSON objects. I mean, it works and I've got the homework finished and working. The thing is that I feel wrong about doing so even though it works.
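For anyone following along, "putting JSON data into an array as JSON objects" looks roughly like this in Javascript -- the field names here are made up for illustration, not from the actual homework:

```javascript
// A JSON string representing a list of objects (hypothetical data)
const json = '[{"name": "Ada", "score": "3"}, {"name": "Bob", "score": "7"}]';

// JSON.parse turns the text into an actual array of objects
const players = JSON.parse(json);

console.log(Array.isArray(players)); // true -- it really is an array
console.log(players[0].name);        // "Ada"

// And here's the gotcha we're discussing: score arrived as a string, not a number
console.log(typeof players[0].score); // "string"
```

Nothing wrong in principle with this approach; it's the standard way to handle incoming JSON.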
Hah. It sounds wrong to just say "3" equals 3...
For those besides the programmer-types here who're having difficulty following along: We're discussing the various ways that a program or script retains a value.
There are myriad types -- that is, categories describing what kind of value a value is. The two simplest examples are the string and the number.
"3" is a string -- it's the textual expression of the number three; while
3 is a number -- it is the number itself, an element that can be manipulated mathematically. There are also common types like boolean (a value of either TRUE or FALSE), array (a structured set of values), and so on. In most languages, when you compare two things, the type is relevant to the comparison by default. So in Java, C, and as far as I know most other common languages, "3" does not equal 3; but in Javascript, PHP, and many other recent scripting languages, "3" does equal 3 unless you require the environment to also take type into consideration. In programmatic shorthand:
Code:
// Two equals signs mean "compare the value without considering type"
3==3 // true in Javascript
"3"==3 // also true in Javascript
// Three equals signs mean "compare the value and the type"
3===3 // true in Javascript
"3"===3 // false in Javascript; a string cannot equal a number
// Things can get weird when type isn't accounted for...
"0"==false // true in Javascript: both sides coerce to the number 0
Anyway. Usually you want to make sure that when you're comparing two things, you're as strict as possible in the comparison -- you usually don't want "0" to equal FALSE. But there are times, particularly when you work on web pages, where you can't get what you want in the way you want it -- numbers and booleans all arrive as strings. The classical way of dealing with this is to convert the data: every value that arrives as the string "false" gets replaced with the boolean FALSE. But when you're dealing with a flood of incoming data, that conversion becomes the slower, less convenient route, and Javascript's shortcut of letting "3" equal 3 becomes a good-enough initial data filter. (Data which you, as a responsible programmer, later re-check and sanitize!)
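To make that "convert the data" step concrete, here's a rough sketch of the kind of sanitizing pass I mean -- the field names and rules are invented for illustration:

```javascript
// Hypothetical form data: everything arrives as strings
const raw = { age: "42", subscribed: "false", name: "CdC" };

// Convert each field to the type we actually want
const clean = {
  age: Number(raw.age),                  // "42" -> 42
  subscribed: raw.subscribed === "true", // "false" -> false (strict comparison!)
  name: String(raw.name).trim(),
};

console.log(clean.age === 42);           // true: now a real number
console.log(clean.subscribed === false); // true: now a real boolean
```

Note the strict === against "true" for the boolean: a loose `raw.subscribed == false` would happily match "0" and "" as well, which is exactly the weirdness from the earlier example.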
And, CdC -- sorry, didn't mean to sound like I was accusing you of wanting to cheat. It's mostly a problem of mine to not know when to stop thinking aloud. As you can see.:rolleyes: