PiscesMike is right. It was more about recognizing a pattern in the number of stars that get printed out on each line.

One way you could come up with this is by making a table like this:

| Row | Number of stars |
| --- | --------------- |
| 0   | 1               |
| 1   | 3               |
| 2   | 5               |
| 3   | 7               |
| 4   | 9               |
| 5   | 11              |
| n   | n * 2 + 1       |

When you look at this table (aside from the last row), you can see that the number of stars increases by 2 from one row to the next. So I know I'm going to have something like 2 * n + (something). Row 0 has 1 star, so (something) becomes 1, and I get my generic form of 2 * n + 1 for any particular row.
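Here's a quick sketch (my own, not from the original thread) that uses that 2 * n + 1 formula directly; the `rows` count is just a hypothetical example value:

```python
rows = 6  # hypothetical: print rows 0 through 5

for n in range(rows):
    # Row n gets 2 * n + 1 stars, matching the table above.
    print("*" * (2 * n + 1))
```

Each line of output is exactly the "Number of stars" column from the table: 1, 3, 5, 7, 9, 11.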

I believe the term PiscesMike was searching for is arithmetic progression. Math-related Wikipedia pages tend to be written for mathematicians rather than normal people, so the content in that link is bound to be a little confusing, but that's the idea.

The name sounds scarier than it really is. It's simply a series of numbers where you start at some value (1 in this case) and add a fixed number on each step (2 in this case). An arithmetic progression can be generalized into the form a * n + b, where **b** is the starting value and **a** is the amount you increase by in each step. For any given row, just put the row number in for **n** and that little equation tells you how many stars are on that row.
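To make the general form concrete, here's a tiny hypothetical helper (names are mine) that computes the n-th term of any arithmetic progression, and then plugs in the values for this star pattern:

```python
def term(n, a, b):
    # n-th term of an arithmetic progression:
    # start at b, add a on each step.
    return a * n + b

# For the star pattern: a = 2 (step size), b = 1 (starting value).
print(term(0, 2, 1))  # row 0 has 1 star
print(term(5, 2, 1))  # row 5 has 11 stars
```

Swapping in different values of **a** and **b** gives you the formula for any similar pattern, like 5, 8, 11, ... (a = 3, b = 5).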