Hope you don't mind, but I'm ignoring everything in your post before "So my question is". I have two things to say about that:
First:
Let's work by analogy. Suppose we divide 100 by 3, with remainder. Then we get
100 / 3 = 33 remainder 1.
In other words,
100 / 3 = 33 + 1/3
(this is exactly what division with remainder means)
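If you want to check that mechanically, Python's built-in divmod does exactly this quotient/remainder operation; a quick sketch (nothing here beyond the standard library):

    from fractions import Fraction

    # Division with remainder: 100 = 3 * 33 + 1
    quotient, remainder = divmod(100, 3)
    print(quotient, remainder)  # 33 1
    assert 100 == 3 * quotient + remainder
    # Equivalently, 100/3 = 33 + 1/3 (exact fractions avoid float rounding):
    assert Fraction(100, 3) == quotient + Fraction(remainder, 3)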
In your situation,
1/0 = 0 remainder 1
is re-written as
1/0 = 0 + 1/0.
(can you see why it's not helpful?)
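For what it's worth, divmod refuses this case outright, and the comment below spells out why hand-picking "0 remainder 1" pins down nothing:

    # divmod(a, b) returns (quotient, remainder) with a == b * quotient + remainder
    try:
        divmod(1, 0)
    except ZeroDivisionError as err:
        print(err)  # integer division or modulo by zero
    # Hand-picking quotient = 0, remainder = 1 doesn't help: since 0 * q == 0,
    # the identity 1 == 0 * q + 1 holds for EVERY choice of q, so "0 remainder 1"
    # singles out nothing, which is why 1/0 = 0 + 1/0 goes in a circle.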
---------------
Second:
You shouldn't mistake an algorithm used to divide for actual division. Suppose we tried using the long division algorithm here:
How many times does 0 go into 1? Well, 700 * 0 = 0, so write down 7 in the hundreds place and subtract 0 from 1 to get 1.
How many times does 0 go into 1? Well, 5000 * 0 = 0, so write down 5 in the thousands place and subtract 0 from 1 to get 1.
etc.
Do you see the problem with this algorithm when attempting to divide by 0? For instance, from the above, I would be able to write
1/0 = 5700 + 1/0
which doesn't even "feel" right. Also, why did I choose 7 and 5? There's no reason I couldn't have chosen any other digits, so the algorithm simply fails when you try to divide by 0.
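You can watch that failure directly. Each step of long division asks "which digits d satisfy d * divisor <= what's left?" and then subtracts; here's a small sketch of just that step (digits_that_fit is a made-up helper for illustration, not a standard routine):

    def digits_that_fit(divisor, remaining):
        """One long-division step: every digit d with d * divisor <= remaining,
        paired with what would be left after subtracting d * divisor."""
        return [(d, remaining - d * divisor) for d in range(10)
                if d * divisor <= remaining]

    # Dividing by 3: the choice is forced (take the largest digit that fits)
    # and the remainder genuinely shrinks.
    print(digits_that_fit(3, 10))  # [(0, 10), (1, 7), (2, 4), (3, 1)]

    # Dividing by 0: every digit fits and the remainder never changes, so the
    # digit choice is arbitrary (hence my 7 and 5 above) and the loop never ends.
    print(digits_that_fit(0, 1))   # [(0, 1), (1, 1), ..., (9, 1)]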