RegEx length matching going wrong?

Regular expressions are not quite my forte, and I am having some trouble understanding why the following is happening:

function showMatch(id)
{
 // span that will display the result of the test
 const spn = document.getElementById(id);
 // the string to be validated
 const value = document.getElementById(`${id}Test`).innerText;
 
 // the pattern in question
 let result = /^[a-zA-Z{1}[a-zA-Z0-9_s&]{2,17}$/.test(value);
 
 spn.innerHTML = result;
}

showMatch('one');
showMatch('two');
showMatch('three');
showMatch('four');
div > span:last-of-type
{
 padding-left:2rem;
 font-weight:bold;
} 
<div><span id='oneTest'>S</span><span id='one'></span></div>
<div><span id='twoTest'>St</span><span id='two'></span></div>
<div><span id='threeTest'>Sta</span><span id='three'></span></div>
<div><span id='fourTest'>Stac</span><span id='four'></span></div>

The regex I am using here is /^[a-zA-Z{1}[a-zA-Z0-9_s&]{2,17}$/ with the intent being as follows (restated in plain JavaScript after the list):

  • Require the first character to be alphabetic
  • Require the subsequent characters to be alphanumeric or a space or an ampersand
  • Require the total length to be between 3 and 18 characters
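To make that intent concrete, here is a non-regex restatement of the three rules that I am treating as my reference for what should and should not match (shouldMatch is just an illustrative name):

function shouldMatch(value)
{
 // rule 3: total length between 3 and 18
 if (value.length < 3 || value.length > 18) return false;
 // rule 1: first character must be alphabetic
 if (!/^[a-zA-Z]/.test(value)) return false;
 // rule 2: every remaining character must be alphanumeric, a space, or an ampersand
 // (my original character class also allowed _, omitted here to follow the list above)
 return [...value.slice(1)].every(ch => /^[a-zA-Z0-9 &]$/.test(ch));
}

console.log(shouldMatch('S'));    // false - too short
console.log(shouldMatch('St'));   // false - too short
console.log(shouldMatch('Sta'));  // true
console.log(shouldMatch('Stac')); // true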

I have been assuming that [a-zA-Z]{1} would consume the first character, which is why the second part tests for a length of between 2 and 17 characters. Clearly, that is not how it is working, or else my second test value St would not yield true. How can this be corrected?
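
My best guess, and I am not at all sure it is right, is that the opening class is missing its closing bracket and that the bare s was meant to be either a literal space or \s. Something like this:

// guessed repair (assumption on my part): close the first class,
// drop the now-redundant {1}, and use a literal space for the stray s;
// the underscore is kept from my original class
let result = /^[a-zA-Z][a-zA-Z0-9 _&]{2,17}$/.test(value);

A literal space also seems safer here than \s, which would additionally let tabs and newlines through, but I would welcome confirmation that the unclosed bracket is the real problem.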