Getting Around Internet Explorer’s 4,096 CSS Rule Limit

Browser incompatibilities are not quite my favorite bugs to find.

I recently got a bit more gray hair when I found out that Internet Explorer can’t handle more than 4,095 (2^12 – 1) selectors in a single CSS file; anything past that point is silently ignored. This limit is easy to hit when you’re using a framework like Rails’ asset pipeline or Compass, which makes it so easy to compile everything into one large CSS file.

In my application, the root cause of such a large CSS file was the combinatorial effect of nested selectors. For example, this "SCSS":http://sass-lang.com/ generates an explosion of rules:

.c1 .c2, .c1 .c3 {
  .c4 .c5, .c4 .c6 {
    .c7 .c8, .c7 .c9 {
      font-size: 100%;
    }
  }
}

And the compiled CSS:

.c1 .c2 .c4 .c5 .c7 .c8,
.c1 .c2 .c4 .c5 .c7 .c9,
.c1 .c2 .c4 .c6 .c7 .c8,
.c1 .c2 .c4 .c6 .c7 .c9,
.c1 .c3 .c4 .c5 .c7 .c8,
.c1 .c3 .c4 .c5 .c7 .c9,
.c1 .c3 .c4 .c6 .c7 .c8,
.c1 .c3 .c4 .c6 .c7 .c9 {
  font-size: 100%;
}

We’ve multiplied three straightforward lines into a single rule with eight selectors (2 × 2 × 2), and selectors are what count against IE’s limit. A few blocks like this can quite easily put you over it. I’ve found more than one solution, but nothing I’ve really felt confident in. I spiked a quick rack-pagespeed filter to try to solve the problem automatically, but a short investigation revealed that this approach won’t be a trivial effort.

In the end, I did the simple thing: I just broke the stylesheet into two files. As a longer-term preventive measure, I added an acceptance spec (written with the Steak framework) that grabs the compiled CSS files and alerts me if any file exceeds the limit. The CSS parsing and counting are based on the CssSplitter code found "here":https://gist.github.com/1131536.
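
For reference, the split itself is just two asset-pipeline manifests instead of one. The file names and require directives below are placeholders for illustration, not the ones from my project:

/* app/assets/stylesheets/application.css (first half of the rules) */
/*
 *= require base
 *= require layout
 */

/* app/assets/stylesheets/application2.css (the rest) */
/*
 *= require components
 *= require pages
 */

Both files then get linked from the layout with <%= stylesheet_link_tag 'application', 'application2' %>, and the second file added to config.assets.precompile so Sprockets compiles it.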

Here’s my spec:

feature 'Stylesheets' do
  scenario 'are all smaller than the Internet Explorer maximum' do
    stylesheets = []

    # Examine the HTML at the given URL and collect the stylesheets linked
    # there. Add similar calls as needed to visit pages that together expose
    # all of your site's CSS.
    stylesheets += stylesheets_at_url('/')

    stylesheets.each do |stylesheet|
      visit stylesheet
      count = count_selectors(source) # Capybara's source is the fetched stylesheet body

      # puts "#{stylesheet}: #{count}" # uncomment for a per-file report

      count.should be < 4096,
        "#{stylesheet} contains #{count} selectors, which exceeds Internet Explorer's limit of 4,095"
    end
  end

  # Visit a page and return the href of every stylesheet linked from it.
  def stylesheets_at_url(url)
    visit url

    page.all('link[rel="stylesheet"][href]').map do |node|
      node['href']
    end
  end

  # Split the CSS into rules (everything up to a closing brace) and total
  # up the selectors in each one.
  def count_selectors(css)
    css.lines("}").inject(0) { |memo, rule| memo + count_selectors_of_rule(rule) }
  end

  # Selectors are comma-separated in the text before the rule's opening brace.
  def count_selectors_of_rule(rule)
    rule.partition(/\{/).first.scan(/,/).count + 1
  end
end
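
For a quick sanity check of the counting logic: selectors are the comma-separated chunks before each rule's opening brace, so the helpers above report three selectors for this (hypothetical) input:

count_selectors(".a .b, .c { color: red; } .d { margin: 0; }") # => 3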

It isn't a perfect solution. I'd prefer that Sprockets just handled this for me (it looks like people have worked on that, but I didn't find anything polished). And while the spec doesn't fix the problem, it's good to get a big red flag whenever it rears its ugly head.

Has anyone out there implemented a better solution?

Conversation
  • nice solution – simple and to the point.

    • Here is a JavaScript snippet that tells you how many selectors each page has:

      https://gist.github.com/4586719

      Elie

      • Mike Swieton says:

        Hi Elie,

        That’s cool! It’s good to see such a short solution. It’s also nice to see it in JavaScript – it means we could update the Capybara test to just run the JS in the browser, which I suspect would be more reliable than using a handful of regular expressions (which always feels brittle to me, even if it hasn’t broken yet).
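
        A minimal sketch of that idea, assuming a JavaScript-capable Capybara driver (only evaluate_script and the browser's standard CSSOM are involved):

        # Count the selectors of each stylesheet inside the browser itself.
        counts = page.evaluate_script(<<-JS)
          (function() {
            var counts = [];
            for (var i = 0; i < document.styleSheets.length; i++) {
              var count = 0, rules;
              try { rules = document.styleSheets[i].cssRules || []; }
              catch (e) { rules = []; } // cross-origin sheets aren't readable
              for (var j = 0; j < rules.length; j++) {
                // Only style rules have selectorText; nested @media rules
                // are ignored in this sketch.
                if (rules[j].selectorText) {
                  count += rules[j].selectorText.split(',').length;
                }
              }
              counts.push(count);
            }
            return counts;
          })()
        JS

        counts.each { |count| count.should be < 4096 }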

        Thanks for posting that!
