I suspect that in your test case the search result portlet is limiting the number of visible rows to a slice of the total set of rows. If so, client-side performance may not be impacted much by a large result set, since the browser only ever renders a slice of the total rows at any time.
As for server-side performance: depending on which type of task search content provider you are using, your application may be consuming a lot of server-side memory, since the framework might have to load the entire set of rows into an in-memory list to sort and limit it before the visible rows are rendered for the client. If you are testing on a server that is not under load or memory constrained, you may not notice much impact. However, if lots of users interact with this page at the same time, you may eventually run out of memory.
If you want to avoid the extra memory consumption related to sorting the results, make sure you are using the “indexed search content provider” capability, since that implementation does the sorting in the database SQL query, so all the rows do not have to be loaded into memory to be sorted.
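To make the difference concrete, here is a minimal, product-neutral sketch of the two approaches using an in-memory SQLite database. The table and column names are made up for illustration; the point is that the first style materializes every row in application memory before sorting and slicing, while the second pushes ORDER BY/LIMIT into the database so only the visible slice is ever loaded:

```python
import sqlite3

# Hypothetical data set: 10,000 tasks with a priority column.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER, priority INTEGER)")
conn.executemany("INSERT INTO tasks VALUES (?, ?)",
                 [(i, i % 100) for i in range(10_000)])

# 1) Non-indexed style: fetch every row into an in-memory list, then sort
#    and slice it. Memory cost grows with the full result set.
all_rows = conn.execute("SELECT id, priority FROM tasks").fetchall()
top_in_memory = sorted(all_rows, key=lambda r: (r[1], r[0]), reverse=True)[:25]

# 2) Indexed style: let the database sort and limit, so only the 25
#    visible rows ever cross into application memory.
top_in_db = conn.execute(
    "SELECT id, priority FROM tasks "
    "ORDER BY priority DESC, id DESC LIMIT 25"
).fetchall()
```

Both queries return the same 25 rows; the difference is purely where the sorting work and the memory cost land.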
The most dangerous scenario for client-side performance is combining a large (or unbounded) max results setting in the search bar with a search result portlet whose “Number of Rows to Display” preference is set to “Show All”. In that case you are rendering all the rows at once and not taking advantage of the table control's large-list paging capability, which limits the visible part to a smaller slice.
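For contrast, the paging behavior that “Show All” bypasses can be sketched as a query that fetches only the page the user is currently viewing. Again the names are illustrative, not the framework's actual API:

```python
import sqlite3

# Hypothetical data set: 1,000 tasks, paged 50 at a time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO tasks (name) VALUES (?)",
                 [(f"task-{i}",) for i in range(1, 1001)])

PAGE_SIZE = 50

def fetch_page(page):
    """Return one page of rows; the browser never receives the full set."""
    offset = (page - 1) * PAGE_SIZE
    return conn.execute(
        "SELECT id, name FROM tasks ORDER BY id LIMIT ? OFFSET ?",
        (PAGE_SIZE, offset),
    ).fetchall()

first_page = fetch_page(1)   # rows 1-50
third_page = fetch_page(3)   # rows 101-150
```

With “Show All”, the equivalent would be a single unbounded SELECT whose entire result is rendered into the page at once.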
In any case, the best practice to guard against both excessive server-side memory consumption and poor client-side browser performance is to limit the result set to a reasonable maximum size.