Before the invention of "high level" programming languages like Java, to program a computer you had to speak its native language -- the language of the processor -- which was encoded in binary as a sequence of 0s and 1s. Each binary sequence told the processor to perform a particular operation -- e.g., 0010101 might mean ADD and 0010111 might mean LOAD a number from memory. This is called "machine language", because it is the language of the machine. The data being manipulated would also be encoded as binary numbers.
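
To make that concrete, here is a toy sketch in Java (purely for illustration) of a "machine" that executes numeric opcodes. The opcode values and the two-instruction design are invented; real processors have far larger and more complicated instruction sets:

```java
// A toy "machine" that executes numeric opcodes.
// The opcodes and the machine itself are made up for illustration.
public class ToyMachine {
    static final int LOAD = 0b0010111; // copy a value from "memory" into the accumulator
    static final int ADD  = 0b0010101; // add a value from "memory" to the accumulator

    public static void main(String[] args) {
        int[] memory  = {5, 7};            // the data, stored as binary numbers
        int[] program = {LOAD, 0, ADD, 1}; // the program: LOAD memory[0], then ADD memory[1]
        int accumulator = 0;

        // Step through the program two numbers at a time: an opcode and an address.
        for (int pc = 0; pc < program.length; pc += 2) {
            int opcode  = program[pc];
            int address = program[pc + 1];
            if (opcode == LOAD)     accumulator = memory[address];
            else if (opcode == ADD) accumulator += memory[address];
        }
        System.out.println(accumulator); // prints 12
    }
}
```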
Eventually programmers came up with human-readable shorthands (mnemonics like "ADD" or "LOAD") called "assembly language" that they could write, and which a program called an assembler would automatically translate into the actual binary sequences that the computer runs. Assembly language is still a "low level" language, because it mirrors how the machine operates -- each assembly instruction typically corresponds to a single machine instruction -- and because it takes a LOT of lines of code to do anything useful.
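
As a rough sketch of the idea (again in Java, with the same invented mnemonics and opcode values as above), an assembler is essentially a translator from names to numbers:

```java
import java.util.List;
import java.util.Map;

// A toy sketch of what an assembler does: translate human-readable mnemonics
// into the numeric opcodes the machine actually runs. The mnemonics and
// opcode values are invented for illustration.
public class ToyAssembler {
    static final Map<String, Integer> OPCODES = Map.of(
            "LOAD", 0b0010111,
            "ADD",  0b0010101);

    public static void main(String[] args) {
        // "Assembly source": one mnemonic and one memory address per line.
        List<String> source = List.of("LOAD 0", "ADD 1");

        for (String line : source) {
            String[] parts = line.split(" ");
            int opcode  = OPCODES.get(parts[0]);  // look up the binary encoding
            int address = Integer.parseInt(parts[1]);
            // Note: toBinaryString drops leading zeros.
            System.out.println(line + "  ->  "
                    + Integer.toBinaryString(opcode) + " "
                    + Integer.toBinaryString(address));
        }
    }
}
```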
Later, computer scientists invented high level languages (like Java) that were more expressive and easier to read and write; a program called a compiler translates them into machine language. (But a compiler is itself a program -- how did they write the first compiler? Well, they had to write it in assembly language or machine language...)
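
To get a feel for that difference in expressiveness, here is a small Java fragment; the low-level steps sketched in the comments are a rough, invented illustration of the kind of work a compiler does for you, not the actual output of any real compiler:

```java
// One readable high-level loop hides many low-level instructions.
public class HighLevelExample {
    public static void main(String[] args) {
        int[] prices = {5, 7, 3};
        int total = 0;
        for (int price : prices) {  // one line of high-level code...
            total += price;         // ...roughly becomes LOAD, ADD, and STORE instructions,
        }                           // plus the compare-and-jump instructions that form the loop.
        System.out.println(total);  // prints 15
    }
}
```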