My name is Azhar, and I am a big fan of your applets; I use them quite a bit to teach my classes.
Recently, I was using the thin lens demonstration, and I noticed that as I brought the object from a distance greater than one focal length to exactly the focal length (object distance = focal length), the light rays go to infinity, as expected.
However, as we bring the object from a distance less than one focal length to exactly one focal length, the light rays are shown to emerge back toward the object's side, which suggests that a virtual image is formed somewhere very far away.
I genuinely want to know which situation is really true.
The above is an email message I received today.
The light rays go to infinity when the object is placed at the focal point, i.e., object distance = focal length.
When the object distance is less than the focal length (no matter how small the difference might be), a virtual image is formed.
When the object distance is greater than the focal length (no matter how small the difference might be), a real image is formed.
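The three cases above follow directly from the thin lens equation, 1/f = 1/do + 1/di. As a sketch (the helper name and the numeric values are my own choices, not part of the applet), the sign of the image distance flips as the object crosses the focal point:

```python
def image_distance(do, f):
    """Thin lens equation: di = f*do / (do - f).
    Sign convention: di > 0 means a real image, di < 0 a virtual image."""
    if do == f:
        raise ZeroDivisionError("object at the focal point: rays emerge parallel")
    return f * do / (do - f)

f = 10.0  # assumed focal length in cm
for do in (10.001, 10.0001, 9.9999, 9.999):
    di = image_distance(do, f)
    kind = "real" if di > 0 else "virtual"
    print(f"do = {do:>8} cm -> di = {di:+.1f} cm ({kind})")
```

However close the object distance gets to the focal length from either side, the image distance stays finite but grows without bound, and its sign (real vs. virtual) is decided entirely by which side of the focal point the object sits on.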
From a mathematical point of view, or within the simulation, we can set object distance = focal length exactly.
However, it is not possible to do so in real life, because an object has finite width and any measurement has finite error.
The simulation was designed so that some students might raise this question and think about what really happens.
The purpose of my simulation is not to provide answers to the questions students might have, but to provide opportunities for students to think about the related physics.
I would suggest that you ask your students to discuss: is it possible to set object distance = focal length in real life?
An object has finite width. What will be seen when the center of mass of the object is placed exactly at the focal point?
Part of the object is then at a distance less than the focal length, and part is at a distance larger than the focal length.
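One way to make this discussion concrete is to apply the thin lens equation to the two edges of such an object separately. The numbers below are assumed values for illustration only:

```python
def image_distance(do, f):
    # Thin lens equation: di = f*do / (do - f); di > 0 real, di < 0 virtual.
    return f * do / (do - f)

f, w = 10.0, 0.2        # assumed focal length and object width (cm)
near = f - w / 2        # edge just inside the focal length
far = f + w / 2         # edge just outside the focal length
print(image_distance(near, f))  # negative: this edge forms a virtual image
print(image_distance(far, f))   # positive: this edge forms a real image
```

So an extended object centered on the focal point is imaged partly as a real image and partly as a virtual image, which is exactly the kind of boundary case worth having students puzzle over.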
Providing an opportunity for students to think is more important than providing them with an answer! :-)