2.2.2 Computational Methods


Computable Problems:

A problem is defined as being computable if there is an algorithm that can solve every instance of it in a finite number of steps.

Computational methods that can be used to tackle a problem include:

  • Enumeration (listing all cases)
  • Theoretical approach
  • Creative solution
  • Simulation

Enumeration:

Many problems can be solved by exhaustive search: trying every possible solution until the correct one is found. This is extremely inefficient, as the number of possible solutions typically grows exponentially with the size of the problem.
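As an illustrative sketch in Python (the 3-digit secret code and the check function are invented for this example), exhaustive search simply tries every candidate in order:

```python
# Exhaustive search: try every possible 3-digit code until one passes the
# check. Guaranteed to find the answer, but the number of candidates grows
# exponentially with the code length (10^n for n digits).
from itertools import product

def crack_code(is_correct, length=3):
    """Try every digit combination until one passes the check."""
    for attempt in product("0123456789", repeat=length):
        guess = "".join(attempt)
        if is_correct(guess):
            return guess
    return None          # no candidate passed the check

secret = "042"
print(crack_code(lambda guess: guess == secret))  # → 042
```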

Simulation:

This is the process of designing a model of a real system in order to understand the behaviour of the system, and to evaluate various strategies for its operation. Example problems:

  • Population predictions
  • Climate Change predictions
  • Engineering design problems

Simulation makes use of abstraction in order to reduce the problem down to its essentials by removing all unnecessary details.
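A minimal sketch of simulation in Python: the model below abstracts a population down to nothing but a birth rate and a death rate (the figures are invented for illustration):

```python
# A deliberately simple population simulation: each year the population
# changes by (birth rate - death rate), and every other detail of a real
# demographic model has been abstracted away.

def simulate_population(initial, birth_rate, death_rate, years):
    population = initial
    history = [population]          # record the population year by year
    for _ in range(years):
        population += population * birth_rate - population * death_rate
        history.append(round(population))
    return history

# 1000 people, 2% births and 1% deaths per year, over 5 years:
print(simulate_population(1000, 0.02, 0.01, 5))
```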

Problem Recognition:

  When it comes to recognising a problem, we have to be able to answer these questions:

  • Outcomes: What do we want to happen (or not happen)? What output will provide us with the answer we need? What does success look like?
  • Inputs: What data will be input? What data is available? What data do we already know? How will we get this data into the computer?
  • Scope & Resources: What are our limitations, restrictions, deadlines? Who is responsible? Who will sign off the finished product?

A problem is solvable by computational methods if we can clearly define:

  • What inputs are available
  • What outputs are required
  • What logical steps will get us from the available inputs to the required outputs

The inputs, outputs and logical steps can then be set out in an algorithm.

Time is also an important factor.

  • If a problem can be solved in a finite number of steps, then it is solvable by computational methods.
  • If a problem takes an infinite number of steps, then it is not solvable by computational methods.
  • If solving a problem takes a finite number of steps, but that number is so large that even a computer would take too long, then we call the problem solvable but intractable.

Problem Decomposition:

Decomposition means breaking a complex problem down into smaller, more manageable sub-problems. Each smaller part can then be solved individually, before all the sub-solutions are combined to solve the original problem.

This allows large teams to each take part of a problem and solve it. Decomposition allows seemingly impossible problems to be solved by splitting them into simple tasks.
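A small Python sketch of decomposition (the "report the average exam mark" problem and its helper functions are hypothetical): each sub-problem gets its own function, and combining the sub-solutions solves the original problem:

```python
# Decomposition: "report the average exam mark" split into three
# sub-problems, each solved by its own function.

def read_marks(raw):          # sub-problem 1: parse the input data
    return [int(m) for m in raw.split(",")]

def mean(values):             # sub-problem 2: compute the average
    return sum(values) / len(values)

def format_report(average):   # sub-problem 3: present the output
    return f"Average mark: {average:.1f}"

# Combining the sub-solutions solves the original problem:
print(format_report(mean(read_marks("56,72,64"))))  # Average mark: 64.0
```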

Problem Abstraction:

Problem Abstraction involves removing details until the problem is represented in a way that is possible to solve, often because it reduces to a problem that has already been solved.

Divide & Conquer:

Divide & Conquer is a very powerful technique which essentially reduces the size of a problem with every iteration. It is used within the Binary Search, which halves the size of the problem with each iteration. Other problems may be tackled in this way but do not necessarily shrink so quickly.
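A Python sketch of binary search, showing how each comparison discards half of the remaining (sorted) list:

```python
# Binary search: each comparison halves the portion of the sorted list
# still under consideration, so the problem size shrinks very quickly.

def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid            # found: return the index
        elif items[mid] < target:
            low = mid + 1         # discard the lower half
        else:
            high = mid - 1        # discard the upper half
    return -1                     # target is not in the list

print(binary_search([2, 5, 8, 12, 16, 23, 38], 16))  # → 4
```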

Backtracking:

In some problems, in order to find a solution, you have to make a series of decisions, but there may be cases where:

  • You don’t have enough information to know which is the best choice.
  • Each decision leads to a new set of choices.
  • One or more of the sequences of choices may be a solution to the problem.

Backtracking is a methodical way of trying out different sequences of decisions until you find one that leads to a solution.

This is done by exploring the different possibilities, which build up like a ‘tree’ of possible moves. Each time you make a choice, it creates a new ‘branch’. If you reach a dead end, go back to the last valid move and try a different branch. Eventually you will find a successful one.
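A classic backtracking example, sketched here in Python, is the N-queens puzzle: each row is a decision point, and reaching a dead end means backing up a row and trying the next column:

```python
# Backtracking: place N queens on an N x N board so that none attack each
# other. cols[r] records the column of the queen placed in row r.

def solve_queens(n, cols=()):
    if len(cols) == n:
        return cols                          # every row filled: a solution
    row = len(cols)
    for col in range(n):
        # a placement is safe if it shares no column and no diagonal
        # with any queen already placed
        if all(col != c and abs(col - c) != row - r
               for r, c in enumerate(cols)):
            result = solve_queens(n, cols + (col,))
            if result is not None:
                return result                # this branch succeeded
        # otherwise: dead end, backtrack and try the next column
    return None

print(solve_queens(4))  # → (1, 3, 0, 2)
```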

Data Mining:

This is the process of digging through massive data sets in order to discover hidden connections and possible predictions in future trends, anomalies and exceptions, and correlations between factors.  This will typically involve the use of different kinds of software packages such as analytics tools.

Big data is the term used for large sets of data that cannot be easily handled in a traditional database.

Heuristics:

A heuristic is a technique designed to solve a problem more quickly when classic methods are too slow, or to find an approximate solution when classic methods fail to find an exact one.

This uses an ‘educated guess’ approach to arrive at a solution when it is unfeasible to analyse all eventualities. It can produce a ‘good enough’ solution, although the result is not 100% reliable, so it is best not to rely too heavily on heuristics in life-and-death scenarios.

It is useful when you need a fast result and the result doesn’t have to be precise.
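One well-known heuristic, sketched here in Python with made-up coordinates, is the nearest-neighbour rule for route planning: always move to the closest unvisited point. It is fast and usually ‘good enough’, but it is not guaranteed to find the shortest route:

```python
# Nearest-neighbour heuristic: build a route by always visiting the closest
# unvisited point next. Fast, but not guaranteed to be the shortest route.
import math

def nearest_neighbour_route(points):
    route = [points[0]]                 # start at the first point
    remaining = points[1:]
    while remaining:
        last = route[-1]
        nearest = min(remaining, key=lambda p: math.dist(last, p))
        route.append(nearest)
        remaining.remove(nearest)
    return route

print(nearest_neighbour_route([(0, 0), (5, 5), (1, 0), (6, 5)]))
# → [(0, 0), (1, 0), (5, 5), (6, 5)]
```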

For example, heuristics are used to assess potential malware. A heuristic scanner may:

  1. Look at behaviour rather than structure, so it can uncover suspicious activity even if it is produced in a novel way
  2. Examine the susceptibility of the system to possible attacks
  3. Simulate the possible effects of a suspected virus
  4. Sometimes decompile the suspicious program, then analyse the resulting source code.

Pattern Matching:

The computer learns a pattern. It can then recognise data which matches this pattern. This depends on having a database of patterns. Pattern Matching may not be good at dealing with new and unexpected data.
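A minimal Python sketch (the ‘signatures’ here are invented for illustration): the computer holds a database of known patterns and reports any that match the incoming data:

```python
# Pattern matching against a database of known patterns - here, hypothetical
# malware "signatures" stored as byte substrings.

SIGNATURES = {
    "EvilWorm": b"\xde\xad\xbe\xef",
    "SpamBot":  b"SEND ALL THE EMAILS",
}

def scan(data):
    """Return the names of all known patterns found in the data."""
    return [name for name, sig in SIGNATURES.items() if sig in data]

sample = b"...\xde\xad\xbe\xef..."
print(scan(sample))  # → ['EvilWorm']
```

Note the weakness mentioned above: data matching no stored pattern (new, unexpected malware) is simply not detected.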

Pipelining:

Pipelining is the technique of splitting tasks into smaller parts and overlapping the processing of each part of the task.

While one instruction is being fetched, another is being decoded and a third, executed.
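A Python sketch of this overlap (a simplified 3-stage pipeline with made-up instruction names): on each clock cycle, a different instruction occupies each stage:

```python
# Which instruction occupies each pipeline stage on each clock cycle.
# With 3 stages and 4 instructions, the pipeline finishes in 6 cycles
# rather than the 12 a purely sequential fetch-decode-execute would take.

STAGES = ["fetch", "decode", "execute"]

def pipeline_schedule(instructions):
    cycles = []
    for cycle in range(len(instructions) + len(STAGES) - 1):
        active = {}
        for stage_index, stage in enumerate(STAGES):
            instr_index = cycle - stage_index   # later stages lag behind
            if 0 <= instr_index < len(instructions):
                active[stage] = instructions[instr_index]
        cycles.append(active)
    return cycles

for cycle, active in enumerate(pipeline_schedule(["i1", "i2", "i3", "i4"]), 1):
    print(cycle, active)
```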

Performance Modelling:

Performance Modelling is the process of simulating different user and system loads on a computer using a mathematical approximation, rather than doing actual performance testing which may be difficult and expensive.

It can be used in a simulation to test an algorithm: you try out as many scenarios as you can, making sure you use a wide range of test data, including extreme conditions. This is safer, cheaper and quicker than running the program for real.
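As a sketch of the mathematical-approximation idea, the classic M/M/1 queueing formula estimates a server’s average response time from just two assumed parameters (requests arriving per second and requests served per second), with no real load test needed:

```python
# Performance modelling with a simple M/M/1 queueing approximation:
# average response time = 1 / (service_rate - arrival_rate).
# The request rates below are invented for illustration.

def mm1_response_time(arrival_rate, service_rate):
    if arrival_rate >= service_rate:
        raise ValueError("overloaded: the queue grows without bound")
    return 1 / (service_rate - arrival_rate)   # seconds per request

# Doubling the load from 40 to 80 req/s on a server handling 100 req/s:
print(mm1_response_time(40, 100))  # ≈ 0.0167 s
print(mm1_response_time(80, 100))  # → 0.05 s
```

The model makes the key behaviour obvious: response time does not grow linearly with load, but blows up as the arrival rate approaches the service rate.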

Visualisation:

Visualisation is presenting data in a form which the user can understand.

Graphs, charts and maps are all examples of visualisation. We can use this to spot trends and patterns in data which is presented visually.

Intractable Problems:

Some problems are termed intractable because although an algorithm may exist for their solution, it would take an unreasonably long time to find the solution.
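A quick Python illustration: a brute-force travelling-salesman search over n cities must consider (n - 1)! routes, a number that becomes unmanageable even for modest n:

```python
# How fast an intractable problem's search space grows: the number of
# possible routes through n cities is (n - 1) factorial.
import math

for n in [5, 10, 15, 20]:
    print(n, math.factorial(n - 1))
```

Even at 20 cities there are over 10^17 routes; an algorithm exists, but no computer could try them all in a reasonable time.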
