May. 8th, 2009
Python. float, damnit, float!!!
06:19 pm

Python is, in most ways, a great language. You get used to the syntactically-significant whitespace and, with a reasonable editor, it doesn't bite you.
What I love is that you can fill a dictionary with arrays of dictionaries filled with arrays... That's something I've found horribly painful in Perl. It opens up some great possibilities, coding-wise. I'm not sure I can ever go back to Perl again, on account of that.
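To show what I mean, here's a little sketch of that kind of nesting (the data is made up, just to illustrate the shape):

```python
# A dictionary of lists of dictionaries of lists -- no ceremony needed.
experiments = {
    "run1": [
        {"temp": 300.0, "readings": [1.2, 1.3, 1.1]},
        {"temp": 310.0, "readings": [1.5, 1.4]},
    ],
    "run2": [
        {"temp": 295.0, "readings": [0.9]},
    ],
}

# Drilling down is just chained subscripts:
first_reading = experiments["run1"][0]["readings"][0]
print(first_reading)  # 1.2

# And growing the structure on the fly is painless too:
experiments["run2"].append({"temp": 305.0, "readings": []})
```

No references, no dereferencing arrows, no arguing with the interpreter about what's a list and what's a list ref.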
What bites me over and over, though, is the complete lack of type declarations. I don't like that you don't declare variables, because I'm not strong at spelling (or consistent in my spelling). You catch those pretty fast, though, when something is "used before initialized". But it really got me when I intended something to be a float and, oops, assigned an int to it, so that, for example, 1/5 results in zero. Trying to take the log of zero... very, very painful! (See, my native language is C. I'm used to declaring once and for all that something is a float, after which it's perfectly fine to say foo = 1; the compiler KNOWS I mean foo = 1.0.)
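Here's the trap in miniature. In the Python of 2009 (Python 2), plain 1/5 between two ints truncates to 0; Python 3's // operator reproduces the same behavior, so the sketch below uses that to stay runnable anywhere:

```python
import math

numerator = 1
denominator = 5

truncated = numerator // denominator        # 0 -- the silent Python 2 trap
explicit = float(numerator) / denominator   # 0.2 -- forcing a float fixes it

# math.log(0) doesn't return some huge negative number; it raises.
try:
    math.log(truncated)
    log_blew_up = False
except ValueError:
    log_blew_up = True

print(truncated, explicit, log_blew_up)
```

The fix is to make at least one operand a float (float(n), or write 5.0), since Python infers the type from the literal rather than from any declaration.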
I'm still reeling with exhaustion after beating myself over THAT stupid bug.