I'm new, but that's already a given.
Back then, I liked math because it has so many applications: I learned math to solve problems, not for its own sake.
Currently, I like both the learning for the sake of learning, and the real life applications.
I like figuring out math on my own, and consider explanations of how to do certain things as "spoilers".
I'm a thinker who connects information to draw conclusions, runs tests to spot patterns, and runs more tests to confirm those conclusions.
On the other hand, I'm a very bad memorizer.
Because of the two traits above, I like to know the reasons behind math rather than simply memorizing a big equation or concept with no rhyme or reason. I think I would prefer that even if I had a great memory.
Um, that's all I can think of for now... Ask away, I guess?
I have thought about these for a long time, but I've never encountered anything about them in math. If this material exists out there, I'd like to know which subject it falls under and what resources I could use to learn about it.
Also, I'm kinda new, so if there are any shortcuts/tools to use math symbols, that would be much appreciated. ^^
1) Can every pattern be represented by a function?
E.g. Prime Numbers: 2, 3, 5, 7, 11, 13, 17, 19, 23, etc...
Supposedly "Random" Phenomena: 7, 7, 3, 2, 6, 8, 5, 0, 6, 0, etc... (I used a random number generator)
Things so complex that they seem impossible to represent, but is it possible?:
1) The first two numbers in the sequence are 0, then 1.
2) The value needed for the next number in the sequence is equal to the number of (sample) standard deviations of the current data (or sample, set, etc.? I don't know the exact terminology) within which 50% of the data falls (about .67, but I want the exact value to be used), multiplied by the standard deviation, then added to the current average of the data. In other words, roughly .67 * Standard Deviation + Average.
3) The value from Step #2 is rounded up if it has a decimal part (it always will, won't it?), and that final number is the next term in the sequence.
NOTE: This is actually something I'm trying to figure out in an application in real life, but I'm more concerned about the math part than what the application is. xD
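In case it helps, here's my attempt to pin the rule down in Python. This is only my interpretation: I'm assuming "rounded up" means the ceiling function, that the .67 multiplier is the 0.75 quantile of a normal distribution (about 0.6745, so that 50% of the data falls within that many standard deviations of the mean), and that "standard deviation" means the sample standard deviation.

```python
import math
import statistics

# Exact multiplier: under a normal-distribution assumption, 50% of the
# data falls within +/- this many standard deviations of the mean.
# It's the 0.75 quantile of the standard normal, about 0.6745.
# statistics.NormalDist is in the standard library (Python 3.8+).
K = statistics.NormalDist().inv_cdf(0.75)

def next_term(data):
    """Next term = K * sample standard deviation + mean, rounded up."""
    return math.ceil(K * statistics.stdev(data) + statistics.mean(data))

seq = [0, 1]                   # Step 1: the first two terms
for _ in range(8):             # Steps 2-3, repeated
    seq.append(next_term(seq))
print(seq)                     # settles down quickly: 0, 1, 1, 2, 2, 2, ...
```

Interestingly, under these assumptions the sequence stops growing almost immediately, so maybe I'm misreading one of my own steps.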
The Digits of Pi: 3, 1, 4, 1, 5, 9, 2, 6, 5, etc...
The Digits of e: 2, 7, 1, 8, 2, 8, 1, 8, 2, etc...
"Weird" (in terms of math) Patterns that seem simple: 1, 2, 3, 2, 1, 2, 3, 2, 1, etc... OR 1, 1, 2, 3, 5, 8, 13, etc... OR 1, 3, 2, 4, 3, 5, 4, 6, etc...
...and anything else I missed, or that you find interesting enough to share about.
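For what it's worth, I did manage to write the three "weird" patterns above as ordinary functions of the position n (counting from 0); the exact formulas are just my guesses at the intended rules:

```python
import math

def cycle_1232(n):
    # 1, 2, 3, 2, 1, 2, 3, 2, ... simply repeats with period 4.
    return [1, 2, 3, 2][n % 4]

def fibonacci(n):
    # 1, 1, 2, 3, 5, 8, 13, ... via Binet's closed form:
    # round(phi^(n+1) / sqrt(5)), where phi is the golden ratio.
    phi = (1 + math.sqrt(5)) / 2
    return round(phi ** (n + 1) / math.sqrt(5))

def interleaved(n):
    # 1, 3, 2, 4, 3, 5, 4, 6, ... : two counters (offset by 2) woven
    # together, alternating by the parity of n.
    return n // 2 + 1 if n % 2 == 0 else n // 2 + 3

print([cycle_1232(n) for n in range(8)])   # [1, 2, 3, 2, 1, 2, 3, 2]
print([fibonacci(n) for n in range(7)])    # [1, 1, 2, 3, 5, 8, 13]
print([interleaved(n) for n in range(8)])  # [1, 3, 2, 4, 3, 5, 4, 6]
```

So at least for these, a pattern that looks odd can still be a perfectly ordinary function; I'm asking whether that's true in general.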
2) Can there be an equation written for any pattern?
Pretty much the same as Question #1, but regarding equations rather than functions (no piecewise definitions, etc.).
3) Is there a way to calculate "randomness", or how random something is?
Because everything in real life has a cause (as far as I know; I'll ask about this in the next question), nothing in real life is truly random.
I've noticed that randomness has two traits:
1) The time it takes for the "random" sequence to repeat its pattern (even though the values themselves may not repeat: e.g. 1, 1, 2, 2, 3 <-- the pattern repeats here).
2) The actual complexity of the pattern, which makes it more unpredictable (e.g. 1, 1, 1, 1, 1, etc... is simpler than 1, 2, 1, 2, 1, 2, 1, 2, etc...).
Also, I've noticed that a pattern is really just a combination of a base pattern and "pattern breaks" (as I like to call them), where there is a pattern governing when the original pattern changes. Something appears more "random" if it has more "pattern breaks", longer sequences, and more complex patterns that aren't accounted for by the "pattern breaks". (Are there patterns that can't be created from "pattern breaks"?)
So, getting back to my question: assuming you know the pattern, how do you calculate its "randomness"? And assuming you don't know the pattern but can produce any amount of data from it, how large would the sample have to be, and how would you calculate its "randomness"?
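To make the question concrete, here's a little experiment I ran with two measures I came across: Shannon entropy (surprise per symbol, computed from frequencies alone) and compressibility (a crude stand-in for how short a description of the sequence can be, which I've seen called Kolmogorov complexity). The zlib ratio is only a rough proxy, not a real complexity measure:

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(seq):
    """Bits of surprise per symbol, based only on how often each symbol
    occurs: 0 for a constant sequence, 1 for a fair coin."""
    n = len(seq)
    return sum((c / n) * math.log2(n / c) for c in Counter(seq).values())

def compression_ratio(seq):
    """Crude proxy for description length: how well the sequence
    compresses. Closer to 1.0 means harder to describe briefly."""
    raw = str(seq).encode("utf-8")
    return len(zlib.compress(raw)) / len(raw)

random.seed(0)
samples = {
    "constant":    [1] * 1000,
    "alternating": [1, 2] * 500,
    "noisy":       [random.randint(0, 9) for _ in range(1000)],
}
for name, s in samples.items():
    print(name, round(shannon_entropy(s), 3), round(compression_ratio(s), 3))
```

The catch I noticed: the alternating sequence scores entropy 1.0, the same as fair coin flips, because frequency counts ignore order; the compression ratio does see that it's really a one-line rule.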
4) Can something be truly random?
Everything in life has a cause, so if true randomness exists, I see two scenarios:
1) There exist things in life that have no cause (the only thing I can think of that would fit is God, if He exists).
2) There exist things whose causes either make them random, or lack the characteristics that would prevent them from being random.
5) I definitely want to apply this knowledge to produce "levels" of randomness in a game, or a computer AI that is able to analyze, estimate, and even predict the gamer's pattern(s). Is this possible in terms of math, and if so, is it possible in terms of programming? (Starting to get off-topic, but this is definitely the main application I have for the topic of patterns and randomness.)
Pretty much, I want to create computer AIs whose actions seem "human-like" instead of rigid and robotic.
If you are playing mahjong or poker against a computer, will it be able to predict your patterns if you aren't "mixing it up" or being "random" enough? I know that in statistics you can guess based on the end results, like "30% of the time, he does X in scenario Y", but that doesn't account for the causes. For example, several players could each take every action with equal overall probability, yet behave very differently over time (e.g. 1, 2, 1, 2, 1, 2, etc... results in doing action 1 50% of the time, but 2, 1, 2, 1, 2, 1, etc... is also 50%, and so is 1, 1, 2, 2, 1, 1, 2, 2, etc...).
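To show what I mean about frequencies not being enough, here's a tiny sketch of one standard fix, an order-1 Markov model (the class and names are just my own). It tracks which action tends to follow which, so it fully exploits a strict alternator, but the 1, 1, 2, 2 player shows its limit: predicting them would need a longer memory (order 2 or more):

```python
from collections import Counter, defaultdict

class MarkovPredictor:
    """Order-1 Markov model: predicts the next action from the player's
    previous action, rather than from overall frequencies."""
    def __init__(self):
        self.transitions = defaultdict(Counter)  # last action -> next-action counts
        self.last = None

    def observe(self, action):
        if self.last is not None:
            self.transitions[self.last][action] += 1
        self.last = action

    def predict(self):
        """Most likely next action given the last action seen."""
        counts = self.transitions.get(self.last)
        return counts.most_common(1)[0][0] if counts else None

# Both players use action 1 exactly half the time, but in different orders.
m = MarkovPredictor()
for a in [1, 2] * 10:          # 1, 2, 1, 2, ...
    m.observe(a)
print(m.predict())             # 1: after a 2, this player always plays 1

p = MarkovPredictor()
for a in [1, 1, 2, 2] * 5:     # 1, 1, 2, 2, ...
    p.observe(a)
print(p.predict())             # 2: wrong here; one action of context isn't enough
```

So a pure frequency table can't tell these players apart at all, an order-1 model catches the first, and the second needs more context, which feels related to my "pattern breaks" idea.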
I want to mention other things, but they're concepts I want to keep for my future video game designs, and I could figure them out if I knew the information above.
Thanks in advance! If you need anything clarified, please ask! (this is kinda hard to explain since I don't know the terminology that goes with it).