If you're looking to really understand how a computer functions (as in understand it, tooth-and-nail), I wouldn't use an FPGA.
I have several starter boards and a blog where I discuss FPGAs, but if you're looking to get into computer hardware and logic, into "how" the processing works, you have to start from the ground up.
This isn't an impossible task - in fact, if you search "homebrew CPU" on Google, plenty of results come up, the bulk of them people building CPUs out of 74-series logic. It may be a massive project, but it is far from impossible - if you put in the learning time to study how these things work, there's nothing stopping you from building, hell, a discrete x86 CPU. Just don't expect it to be small or fast.
I, for example, designed a 32-bit CPU that could be built with 74s, but never got around to building it because of the cost of the boards and the cheap-college-student curse. I'd like to one day. That said, I suggest you start in one of two ways:
1. Stop looking at a CPU as a black box that does mysterious operations on binary code. Look at it as a calculator. Everything a CPU does can be replicated by moving data around plus the four basic arithmetic operations (add, subtract, multiply, divide). In fact, having studied the architecture, I'll go further: you can implement a full-fledged CPU and replicate every "legacy" instruction, even for x86 processors, with a machine that can only perform three operations: increment, shift left, and shift right (see the first sketch after this list).
2. Study basic logic. That is, how individual bits are manipulated, along with timing. Timing is absolutely critical. For example, you need to know how to have an ALU (arithmetic logic unit) compute a result, but wire up its inputs and outputs so that the ALU only loads and stores data on the rising edge of a clock signal. That's basic clock timing (the second sketch below shows the idea).
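
To make point 1 concrete, here's a minimal Python sketch of building bigger arithmetic out of smaller primitives (Python standing in for hardware purely as an illustration; the names and the 32-bit width are my own assumptions, not any particular CPU design):

```python
# Illustration only: "big" arithmetic built from increment and shifts.
MASK = 0xFFFFFFFF  # model 32-bit registers that wrap around

def inc(x):   # primitive 1: increment
    return (x + 1) & MASK

def shl(x):   # primitive 2: shift left one bit
    return (x << 1) & MASK

def shr(x):   # primitive 3: shift right one bit
    return x >> 1

def add(a, b):
    # Addition as repeated increment: slow, but it only needs inc().
    # The looping itself is the job of the CPU's control logic.
    for _ in range(b):
        a = inc(a)
    return a

def mul(a, b):
    # Classic shift-and-add multiplication: walk b one bit at a time.
    # (Testing the low bit is also control logic, not arithmetic.)
    result = 0
    while b:
        if b & 1:      # low bit of b set? add in the shifted multiplicand
            result = add(result, a)
        a = shl(a)     # each successive bit of b weighs twice as much
        b = shr(b)
    return result

print(mul(7, 6))  # 42
```

The point isn't efficiency; it's that multiply falls apart into shifts and adds, and add falls apart into increments. Real hardware does the same decomposition, just in parallel silicon instead of a loop.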
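
And for point 2, a toy simulation of edge-triggered timing, again in Python with made-up names. The ALU computes all the time (it's combinational), but its result only lands in the register when the clock goes from 0 to 1:

```python
class Register:
    """A D-type register: output changes only on a rising clock edge."""
    def __init__(self):
        self.q = 0          # stored value (the register's output)
        self._prev_clk = 0  # last clock level we saw

    def tick(self, clk, d):
        if self._prev_clk == 0 and clk == 1:  # rising edge: 0 -> 1
            self.q = d                        # latch the input
        self._prev_clk = clk

def alu(a, b, op):
    """Combinational ALU: the result is always 'there', never stored."""
    return {"add": (a + b) & 0xFF, "sub": (a - b) & 0xFF}[op]

acc = Register()
# Drive a few half-periods of the clock by hand.
for clk in (0, 1, 0, 1):
    result = alu(acc.q, 5, "add")  # computing continuously
    acc.tick(clk, result)          # but captured only on rising edges
    print(f"clk={clk} alu={result} acc={acc.q}")
```

Run it and watch `acc` change only on the 0-to-1 lines; that separation between "always computing" and "stored on the edge" is the whole trick behind synchronous design.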
I've worked a lot with FPGAs, a lot with microcontrollers, and a good deal in software, but there was no learning task I enjoyed more than computer logic. Once you have the groundwork for how the computer actually processes, what it actually does to data, it's magical. It takes a lot of the mystery out of what a computer does and how it does it.
Best of luck. If you want guidance, let me know.