Hypotheses when both data and prior knowledge are available


Consider again the question from Section 12.2.1 of Tom Mitchell's textbook regarding what criterion to use for choosing among hypotheses when both data and prior knowledge are available. Give your own viewpoint on this issue.

13.3. Consider playing Tic-Tac-Toe against an opponent who plays randomly. In particular, assume the opponent chooses with uniform probability any open space, unless there is a forced move (in which case it makes the obvious correct move).
(a) Formulate the problem of learning an optimal Tic-Tac-Toe strategy in this case as a Q-learning task. What are the states, transitions, and rewards in this nondeterministic Markov decision process?
(b) Will your program succeed if the opponent plays optimally rather than randomly?
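For reference, the standard tabular Q-learning update for a nondeterministic MDP can be sketched as follows. This is a minimal illustration of the update rule only, not a full Tic-Tac-Toe solution; the state/action encodings are hypothetical, and the visit-count-based decaying learning rate follows Mitchell's recommendation for nondeterministic environments.

```python
# Minimal sketch of the tabular Q-learning update for a
# nondeterministic MDP. State and action encodings are placeholders.
from collections import defaultdict

GAMMA = 0.9  # discount factor (an assumed value)

Q = defaultdict(float)     # Q[(state, action)] -> current value estimate
visits = defaultdict(int)  # visit counts, used for the decaying learning rate

def q_update(state, action, reward, next_state, next_actions):
    """One Q-learning step. The learning rate alpha decays with the
    number of visits to (state, action), which is needed for
    convergence in nondeterministic MDPs."""
    visits[(state, action)] += 1
    alpha = 1.0 / (1 + visits[(state, action)])
    # Value of the best available action in the successor state
    # (0.0 for a terminal state with no available actions).
    best_next = max((Q[(next_state, a)] for a in next_actions), default=0.0)
    Q[(state, action)] = (1 - alpha) * Q[(state, action)] + \
                         alpha * (reward + GAMMA * best_next)
```

For Tic-Tac-Toe, a state would encode the board after the opponent's reply, making the transition function stochastic under the random opponent.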

Problem 3: (Genetic Algorithms; 40 points total)

Consider a hypothesis space made of bit strings of length 20. You are trying to learn a concept over this hypothesis space; the concept corresponds to the subset of the strings that are its positive instances. For each new 20-bit string you generate (using the GA), the teacher tells you whether it is a '+' or a '-' instance of the concept. Therefore, you can only learn the concept with certainty if you can generate all possible bit strings.
(a) How many bit strings are there? How many distinct concepts are possible over this set of bit strings?
Now (for parts b - d of this problem) assume your initial population is made of the following four strings: 
{x1 = 110010, x2 = 1100801, x3 = 1100810, x4 = 1100811}
Assume further that each generation consists of exactly four strings, where the next generation is obtained from the previous one by some combination of selection, crossover, and (when available) mutation.

(b) Assume you use the crossover operation where you combine the first ten bits of one string in the current generation with the last ten bits of another string, like this:

X(1-10)X(11-20) and Y(1-10)Y(11-20) crossed over yield X(1-10)Y(11-20) and Y(1-10)X(11-20), 

where X(1-10) is shorthand for the first 10 bits of the original string X, and similarly for the other substrings. Only some form of selection and this particular form of crossover are available to you for generating new configurations (i.e., no mutation). Is this restricted version of the GA, in general, able to learn an arbitrary concept over the set of 20-bit strings?
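The crossover operation described above is single-point crossover at a fixed cut position. A minimal sketch, assuming strings are represented as Python `str` values of '0'/'1' characters:

```python
def crossover(x, y, cut=10):
    """Single-point crossover at a fixed cut position: swap the tails
    of two equal-length bit strings, producing two children."""
    assert len(x) == len(y)
    return x[:cut] + y[cut:], y[:cut] + x[cut:]
```

Note that this operator can never flip an individual bit: every bit in a child is copied from the same position in one of its parents, which is the key observation for answering part (b).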

(c) Now assume you also have random mutation, which you can apply to any of the last 5 bits of any string. Does mutation make a difference to the GA's learning ability in this setting? Can you learn more concepts than before? Can you learn an arbitrary concept? (That is, answer the question from part (b), but for the modified version of the GA that includes restricted mutation.)

(d) Now assume you can apply random mutation to any bit of any string (i.e., the GA now has unrestricted mutation at its disposal). Answer the questions from parts (b) and (c) for this setting. Explain carefully whether an arbitrary concept can be learned when the GA has unrestricted mutation available.
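Both mutation variants from parts (c) and (d) can be captured by one operator parameterized by the set of positions it is allowed to touch. A sketch, again assuming the string representation used above (positions are 0-indexed, so "the last 5 bits" of a 20-bit string are positions 15-19):

```python
import random

def mutate(s, allowed_positions, rng=random):
    """Flip one randomly chosen bit among allowed_positions.
    For part (c), allowed_positions would be range(15, 20) (the last
    5 bits of a 20-bit string); for part (d), range(len(s)) (any bit)."""
    i = rng.choice(list(allowed_positions))
    flipped = '1' if s[i] == '0' else '0'
    return s[:i] + flipped + s[i + 1:]
```

Restricted mutation can only ever change the tail positions, while unrestricted mutation can, in enough steps, transform any string into any other, which is the distinction the two parts turn on.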
