An assertion is a boolean check added by the programmer that helps a program detect its own defects (typos, logical errors in the code, erroneous output values, etc.). Unlike classic exception handling (i.e. a deliberate crash with a message, say "error XY in line 3678"), such a function can also log the line where the failure occurred (or even x lines before the assertion was triggered, as Dave described) and then proceed with program execution (if it's not a fatal error) until, say, the scenario end has been reached or one of the AIs has wiped out all enemies, or it can load a savegame and just proceed with the game. Once autotesting no longer triggers any assertions, betatesters test a new (corrected) version and, during their test runs, focus on the functions/situations that had triggered misbehaviour or crashes in the previous version.
The betatesters use the debug version, so they have additional tools that the regular user/player doesn't have. Depending on the debug tools and functions included, a betatester may be able to discover funky AI behaviour: he may be able to remove the fog of war, for instance, or add handicaps to the enemy AI to slow it down and ease observation.
Big companies like EA employ quite a few software engineers in their quality assurance divisions, where they use (or even develop their own) autotest environments and run the games on a number of machines with different hardware and operating systems. Some divisions have an entire "machine park" for autotesting; some autotest only particular game modules rather than the entire game, and leave the detailed testing to in-house testers.
IIRC, Dave used to have one dedicated machine for autotests.