Dude, I'm a graduate student (although not in maths), but I've gone through a host of maths courses (calculus, algebra, statistics) and I've never gotten anything less than 87%. I'm a graduate student in economics; you don't have to teach me maths.
Students of mathematics often reject the equality of 0.999… and 1 for reasons ranging from their disparate appearance to deep misgivings over the limit concept and disagreements over the nature of infinitesimals. There are many common contributing factors to the confusion:
Students are often "mentally committed to the notion that a number can be represented in one and only one way by a decimal." Seeing two manifestly different decimals representing the same number appears to be a paradox, which is multiplied by the appearance of the seemingly well-understood number 1.
Some students interpret "0.999…" (or similar notation) as a large but finite string of 9s, possibly with a variable, unspecified length. If they accept an infinite string of nines, they may still expect a last 9 at infinity.
Intuition and ambiguous teaching lead students to think of the limit of a sequence as a kind of infinite process rather than a fixed value, since the sequence never reaches its limit. Those who accept the difference between a sequence of numbers and its limit might read "0.999…" as meaning the former rather than the latter.
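The distinction between a sequence and its limit can be made concrete with a short Python sketch (the helper `partial_nines` is my own illustrative name, not from the text): every finite truncation 0.9, 0.99, 0.999, … falls short of 1, yet the shortfall is exactly 10⁻ⁿ, which shrinks below any positive bound.

```python
from fractions import Fraction

def partial_nines(n):
    """Exact value of 0.99...9 with n nines: sum of 9/10**k for k = 1..n."""
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

# Every finite truncation falls short of 1 by exactly 10**-n ...
for n in (1, 5, 20):
    assert Fraction(1) - partial_nines(n) == Fraction(1, 10**n)

# ... so no single term of the sequence equals 1, but the gap can be made
# smaller than any positive bound: the limit of the sequence is 1 itself.
```

Using `Fraction` keeps the arithmetic exact, so the gap of 10⁻ⁿ is computed precisely rather than approximated in floating point.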
All known authorities agree that these ideas are mistaken in the context of the standard real numbers. On the other hand, many of them are partially borne out in more sophisticated structures, either invented for their general mathematical utility or as instructive counterexamples to better understand 0.999….
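In the standard real numbers, the equality itself follows from the usual geometric series formula; one standard derivation runs:

```latex
0.999\ldots \;=\; \sum_{k=1}^{\infty} \frac{9}{10^{k}}
\;=\; 9 \cdot \frac{1/10}{1 - 1/10}
\;=\; 9 \cdot \frac{1}{9}
\;=\; 1.
```

Here the infinite sum is, by definition, the limit of its partial sums, which is precisely the point the misreadings above lose track of.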
Many of these explanations were found by Professor David Tall, who has studied characteristics of teaching and cognition that lead to some of the misunderstandings he has encountered in his college students. Interviewing his students to determine why the vast majority initially rejected the equality, he found that "students continued to conceive of 0.999… as a sequence of numbers getting closer and closer to 1 and not a fixed value, because 'you haven't specified how many places there are' or 'it is the nearest possible decimal below 1'".
A typical calculator cannot help one reason with 0.999….

Joseph Mazur tells the tale of an otherwise brilliant calculus student of his who "challenged almost everything I said in class but never questioned his calculator," and who had come to believe that nine digits are all one needs to do mathematics, including calculating the square root of 23. The student remained uncomfortable with a limiting argument that 9.99… = 10, calling it a "wildly imagined infinite growing process."
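The nine-digit limitation can be illustrated with a Python sketch (the variable names are mine, not Mazur's): a nine-significant-digit value of √23 does not square back to 23, just as nine nines after the decimal point are close to, but strictly less than, 1.

```python
import math

# A calculator keeping nine significant digits truncates the true value.
nine_digit_root = round(math.sqrt(23), 8)  # nine significant digits of sqrt(23)

# Squaring the nine-digit value does not recover 23 exactly ...
assert nine_digit_root ** 2 != 23

# ... and nine nines is a finite decimal strictly below 1, even though a
# nine-digit display cannot show anything closer to 1 from below.
assert 0.999999999 < 1
```

The point is the same one the limiting argument makes: finite truncations, however many digits a display holds, are approximations; 0.999… with infinitely many nines is not a truncation at all.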
That explains why I don't think .999… equals 1, and obviously I'm not alone. I believe .99r is not 1 but the number closest to 1.