Did some simple benchmarking of a few things.
PHP:
// Average over 100,000 iterations: 10.5ms
$a = $_GET['a'];
// Average over 100,000 iterations: 27.6ms
$a = htmlentities($_GET['a']);
// Average over 100,000 iterations: 51.4ms
$a = filter_input(INPUT_GET, 'a', FILTER_SANITIZE_STRING);
// Average over 100,000 iterations: 56.8ms
$a = filter_input(INPUT_GET, 'a', FILTER_UNSAFE_RAW);
Actual benchmark code:
PHP:
<?php
$start = microtime(true); // pass true: without it, microtime() returns a string and the subtraction below is wrong
for ($i = 0; $i < 100000; $i++) {
    $a = $_GET['a'];
}
$end = microtime(true);
$result = round($end - $start, 4);
echo 'Time Elapsed: ' . $result . 's';
?>
To get the numbers, I benchmarked each line separately 10 times, averaged the results, and converted from seconds to milliseconds.
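The repeat-and-average procedure above can be sketched as a small reusable harness. This is just a sketch, not the code I actually ran: the `benchmark()` helper and its parameters are my own naming, and it assumes PHP 7.4+ (for arrow functions and the monotonic `hrtime()` clock, which avoids the wall-clock drift `microtime()` can pick up):

```php
<?php
// Sketch: run $fn $iterations times, repeat for $runs runs,
// and return the average elapsed time per run in milliseconds.
function benchmark(callable $fn, int $iterations = 100000, int $runs = 10): float {
    $total = 0.0;
    for ($r = 0; $r < $runs; $r++) {
        $start = hrtime(true);                   // nanoseconds, monotonic
        for ($i = 0; $i < $iterations; $i++) {
            $fn();
        }
        $total += (hrtime(true) - $start) / 1e6; // ns -> ms
    }
    return round($total / $runs, 4);
}

// Example usage (CLI, so $_GET is faked):
$_GET['a'] = 'some <b>input</b>';
echo benchmark(fn() => htmlentities($_GET['a'])) . " ms\n";
```

Timing the closure call itself adds a little constant overhead per iteration, so absolute numbers will run slightly higher than timing the statement inline, but relative comparisons between the four lines should hold.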
I would rather use code that is quicker both to execute and to write: code I can read easily, where I and everyone else reading it know exactly how it is secured. Not to mention that some strings don't need a given kind of sanitization, and others won't even come through it intact.
/2cents