You are viewing an historical archive of past issues. Please report new issues to the appropriate project issue tracker on GitHub.

Feature request #1131: Add in-browser (Selenium) tests

Kind feature request
Product wincent.dev
When Created 2008-09-09T16:13:50Z, updated 2009-11-26T16:52:06Z
Status closed
Reporter Greg Hurrell
Tags no tags

Description

Bug #1130 crept in because of the lack of in-browser tests (eg. Selenium-driven tests). This was a JavaScript bug and so that's the only way that it could be caught.

Comments

  1. Greg Hurrell 2008-09-11T15:13:12Z

    It's worth clarifying that the existing Webrat integration tests won't cut it here, because Webrat doesn't handle JavaScript either.

  2. Greg Hurrell 2009-04-24T10:16:47Z

    The ball is now rolling; see "Installing Selenium 1.1.14 on Mac OS X 10.5.6 Leopard". I am trying to follow the recipe detailed here under "Cucumber / Selenium / Webrat".

  3. Greg Hurrell 2009-04-24T11:16:42Z

    This (on getting Selenium to work with self-signed SSL certificates) is likely to prove useful.

  4. Greg Hurrell 2009-04-24T11:37:15Z

    Starting to look like this could be a real pain for the following reasons:

    • For security the app is designed to run on SSL
    • Mongrel itself doesn't speak SSL, so I have to set up an nginx proxy in front of Mongrel to make it work
    • Again for security, the app is set up to forward non-SSL requests to the equivalent SSL address
    • All the routes in the app are set up to expect and generate SSL URIs
    • In local testing (host "localhost"), I'm using self-signed certificates for obvious reasons
    • Selenium runs Firefox in a clean environment (no persistent preferences) and so it's not possible to add an exception for the self-signed certificate and have it stick
    • The recipe I've followed notes that you don't (and in fact can't) launch the Selenium server manually (which would be one way to specify a custom Firefox profile, as mentioned here), because Webrat apparently spawns a new Selenium server and Mongrel process for you.
    • It's undesirable to add test-environment-only (or Selenium-environment-only) modifications to work around this SSL behaviour, because the whole point of in-browser integration tests is to test the code that you're planning on deploying, not some mutated version of it.
    • Even if it were desirable it would be hideously complex, because the entire application is engineered with the SSL requirement in mind (not just at the "before filter" level, but every single route in the application).

    So it's starting to look pretty grim. Will have to think some more about it.
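    The forward-to-SSL behaviour described above boils down to logic like the following, a minimal sketch in plain Ruby rather than the app's actual "before filter" (redirect_target is a made-up helper name):

```ruby
# Illustrative reconstruction of the forward-to-SSL logic described
# above; not the app's actual code. redirect_target is hypothetical.
def redirect_target(ssl, host, request_uri)
  return nil if ssl                  # already HTTPS: nothing to do
  "https://#{host}#{request_uri}"    # bounce to the HTTPS equivalent
end
```

    In a Rails controller this logic would sit inside a before filter that calls redirect_to whenever a target URL is returned.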

  5. Greg Hurrell 2009-04-28T05:08:46Z

    Random thoughts on the problem:

    • might be able to use the local_request? controller method to allow local requests to use non-SSL behaviour; this feels somehow less kludgey than doing if ENV['RAILS_ENV'] == 'test'
    • this should be enough to make Cucumber/Selenium work without any of the certificate hassles
    • it might also allow me to remove my ugly ssl? overrides in spec/spec_helper.rb and features/support/env.rb (not sure if local_request? returns true in the test environment)
    • on the other hand, I have a bunch of places in the app where I set or expect :protocol => 'https'
    • so with this change will route recognition and generation be broken?
    • will I need to back out all that stuff and use _path helpers instead of _url helpers everywhere?
    • and is backing it out going to be easy, or even possible? The real problems are things like Atom feeds, which are cached and so must embed the correct URL (otherwise one rogue client connecting via HTTP instead of HTTPS will pollute the cache with non-SSL URLs)
    • outgoing mails as well need proper HTTPS URLs, although that might not be such a problem seeing as I can generally pull the host from the application config rather than the request environment
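    The local_request? idea in the first bullet amounts to something like this (a hedged sketch, assuming local requests come from the loopback address; LOOPBACK and enforce_ssl? are made-up names, not Rails' actual implementation):

```ruby
# Hedged sketch of the local_request? idea: only force SSL for
# connections that don't originate on the local host. LOOPBACK and
# enforce_ssl? are hypothetical names used for illustration.
LOOPBACK = %w[127.0.0.1 ::1].freeze

def local_request?(remote_addr)
  LOOPBACK.include?(remote_addr)
end

def enforce_ssl?(ssl, remote_addr)
  return false if ssl            # already on HTTPS: nothing to enforce
  !local_request?(remote_addr)   # allow plain HTTP for local clients only
end
```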

    One of the problems is that Mongrel doesn't support SSL, so we're forced to use an nginx/Mongrel proxy arrangement. It might be worth trialling Passenger, seeing as that would allow me to take Mongrel out of the loop entirely. See this page for instructions on setting up Passenger and nginx with SSL support. That in turn might make the whole Selenium issue go away (although not yet... evidently the problem remains that the Selenium setup described above auto-spawns a Mongrel to handle requests).

  6. Greg Hurrell 2009-04-28T12:19:11Z

    More random thoughts:

    I am beginning to think that my initial approach (trying to enforce SSL access at the application level) was completely misguided. (I was probably seduced by the example of the "SSL Requirement" plugin.) I don't think this is an application-level problem; it is a firewall-level problem. At least, that is true in the case of this application because SSL should apply across the entire application and not just specific controllers/actions.

    So:

    • the Mongrel instances should listen on high port numbers (eg. 12345)
    • the OS firewall should prevent non-local connections to those ports; this means that nobody can connect directly to the mongrels
    • external visitors should connect only to the nginx front-end
    • nginx can proxy the connections to the mongrel instances because it is on the same host (in the case of multiple machines the firewall rules could be modified accordingly)
    • if nginx does the right thing, only SSL connections will be forwarded to the back-end; all others will be redirected from the HTTP URL to the HTTPS equivalent
    • RSpec specs and Cucumber features should run fine and be able to connect to the local Mongrel instances because they are running on the same host
    • because of all this, the application shouldn't do any ssl? checks and the ensure_correct_protocol "before filter" shouldn't even exist
    • the routes file should be free of all :protocol => 'https' declarations; in fact, pretty much the entire codebase should be free of them
    • to ensure correct URL generation we should always use _path helpers rather than _url helpers
    • may or may not need to combine this idea with the use of local_request?, but my initial gut reaction is that it won't be necessary
    • outgoing mails should use correct HTTPS URLs because they should pull host information from the application config rather than from the request environment
    • things like Atom feeds still need absolute URLs, and they need to be HTTPS URLs, but I am not sure how that is going to pan out until I try it
    • theoretically, Selenium should work as well, even when it autospawns a Mongrel instance
    • if I still can't get Selenium to play ball due to the autospawned Mongrel problem, I should look into Selenium-RC and see if I can get things working via that route
    • the codebase should be a little simpler because we've just delegated the role of protocol enforcement to an external layer
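    The nginx front-end sketched in the bullets above might look roughly like this (an illustrative fragment only, not the live configuration; the port number and certificate paths are made up):

```nginx
server {
    listen 80;
    # redirect all plain-HTTP traffic to the HTTPS equivalent
    rewrite ^(.*)$ https://$host$1 permanent;
}

server {
    listen 443;
    ssl on;
    ssl_certificate     /etc/nginx/ssl/server.crt;   # hypothetical paths
    ssl_certificate_key /etc/nginx/ssl/server.key;
    location / {
        # Mongrel bound to a loopback-only high port
        proxy_pass http://127.0.0.1:12345;
    }
}
```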
  7. Greg Hurrell 2009-04-28T13:46:59Z

    For some notes on firewall configuration, see "iptables".

  8. Greg Hurrell 2009-04-28T14:52:50Z

    One thing to note: without even setting up firewall rules, I can probably just train the Mongrels to bind to the loopback interface instead of a public IP address.

  9. Greg Hurrell 2009-04-28T15:37:06Z

    OK, as a first step, that's exactly what I've done (made the Mongrels bind to 127.0.0.1, and trained both nginx and monit to look for them there).

  10. Greg Hurrell 2009-04-28T18:22:57Z

    I've done a little experimentation and it looks like I won't be able to do non-SSL tests with Selenium after all.

    I basically went through and:

    • changed all appropriate _url helpers to _path helpers
    • dropped the ensure_correct_protocol "before filter"
    • dropped the ssl? hacks in spec/spec_helper.rb and features/support/env.rb

    I guess these changes are worthwhile in themselves, but alas, they don't really help me with Selenium, because for many things I still can't access the Mongrel instance directly over HTTP. This is because the login cookie is set to "Secure", which means you can log in but still end up surfing the site as an anonymous guest, because the cookie doesn't get sent with your requests.

    Workarounds?

    • drop the "Secure" setting on the cookie, thus losing a layer of "defense in depth"?
    • drop the "Secure" setting only if local_request? returns true?

    The second option might be worthwhile, as this is really all about local testability. Will sleep on it.
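    The second workaround amounts to computing the cookie's "Secure" flag from the request origin. A minimal sketch (LOCAL_ADDRS and secure_cookie? are hypothetical names, not the app's actual code):

```ruby
# Hedged sketch of workaround 2: keep the "Secure" flag on the login
# cookie except for local (test) connections. Names are made up.
LOCAL_ADDRS = %w[127.0.0.1 ::1].freeze

def local_request?(remote_addr)
  LOCAL_ADDRS.include?(remote_addr)
end

def secure_cookie?(remote_addr)
  !local_request?(remote_addr)   # drop "Secure" only for local requests
end
```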

  11. Greg Hurrell 2009-04-29T06:26:40Z

    Found some interesting docs on Selenium and HTTPS here:

    Support for HTTPS

    Because of the unique proxy-based nature of Selenium RC, tests that require HTTPS can be a challenge. Our proxy attempts to examine and modify the HTTP requests being sent over the wire; HTTPS is explicitly designed to prevent anyone from doing that (even though, in our case, our intentions are benign).

    Therefore, our proxy is configured to automatically generate dummy SSL certificates whenever you browse to an HTTPS site. Our dummy certificates are automatically signed by Selenium RC using a made-up certificate authority (CA) called "CyberVillains"; if your browser is configured to trust the CyberVillains CA, it will automatically trust our dummy certificates, allowing us to read/modify the content of your SSL requests.

    Using the CyberVillains CA is inherently dangerous. When you use a browser that trusts CyberVillains, any website can claim to be any other website for the purposes of SSL authentication. Don't use Selenium RC while browsing to websites that you don't trust.

    On Firefox and Opera, we can automatically configure the browser to temporarily trust CyberVillains, but we don't yet provide support for this on Internet Explorer, Safari, or Konqueror. If you attempt to browse to an HTTPS website using one of those browsers without manually trusting the CyberVillains CA, you'll get a security warning dialog, which, unfortunately, you won't be able to use Selenium to click through. For those browsers, see the instructions below that explain how to use the CyberVillains CA manually.

    Another thing that makes HTTPS challenging is that many sites that require HTTPS use it only for certain pages (e.g. only for the login page), which means that testing such sites requires switching between different domains, in violation of the same-origin policy.

    If your test needs to use multiple domains (HTTP and HTTPS in the same test), your best choices are the *chrome or *iehta browser launchers, alternatively you can use proxy injection mode. If you use *chrome or *iehta, you don't need to use the Selenium proxy at all, in which case you don't have to trust the CyberVillains CA. If you use proxy injection mode, you'll still have to trust CyberVillains, following the directions below.

    Using the CyberVillains CA manually

    Did I mention that using the CyberVillains CA is inherently dangerous? Using it manually is more dangerous, because it introduces the risk that you'll forget to uninstall the CyberVillains CA when you're done using Selenium RC. You MUST manually uninstall the CyberVillains CA after you're finished using Selenium RC.

    Windows/IE: Look for "cyberVillainsCA.cer" in the "server" folder in the selenium-remote-control zip, double-click on it, and press the "Install Certificate" button. Then click "Next" a few times, and finally "Finish" to install the CA certificate. To uninstall the CA certificate, use the "Internet Options" Control Panel, click on the "Content" tab, and click on the "Certificates" button. Finally, under the "Trusted Root Certification Authorities" tab, scroll down to CyberVillains and click on the "Remove" button to remove the cert.

    OS X/Safari: Look for "cyberVillainsCA.cer" in the "server" folder in the selenium-remote-control zip and double-click on it. You'll be prompted to select a keychain; select the "X509Anchors" keychain. You'll be required to enter your administrative password to install the certificate. To uninstall the CA certificate, start the "Keychain Access" program, in "/Applications/Utilities". Click on "Show Keychains" and select the X509Anchors keychain. If that keychain isn't available, select "Add Keychain" from the File menu, then navigate to /System/Library/Keychains and open the "X509Anchors" file in that directory. Once you've opened the X509Anchors keychain, you should be able to scroll down to the CyberVillains CA and delete it using the Delete button.

    Linux/Konqueror: In the Settings menu, choose "Configure Konqueror." Select the "Crypto" icon on the left-hand navigation bar, then select the "SSL Signers" tab by clicking on the scroll-right arrow in the upper-right corner of the window. Click on the Import button to import a certificate. Look for "cyberVillainsCA.cer" in the "server" folder in the selenium-remote-control zip and double-click on it to import the certificate. To uninstall the certificate, just click on "CyberVillains" in the "SSL Signers" tab and click on the "Remove" button.

  12. Greg Hurrell 2009-04-29T06:29:38Z

    Related notes on using the "CyberVillains CA":

    • http://clearspace.openqa.org/thread/11973
  13. Greg Hurrell 2009-04-29T06:36:26Z

    From the page I just linked to:

    As I was told from some forums, I had to start Selenium server from a command line:

    D:\selenium-remote-control-0.9.2\selenium-server-0.9.2>java -jar selenium-server.jar -interactive -port 60000

    So using this file: "selenium-remote-control-0.9.2\selenium-server-0.9.2\sslSupport\cybervillainsCA.cer" the browser would accept by default any certificate, "automatically signed by Selenium RC using a made-up certificate authority (CA) called CyberVillains".

    I don't see anything in there instructing Selenium to automatically install the "CyberVillains" certificate, and indeed when I run the feature suite and look in the Firefox preferences I can confirm that there is no such certificate present. So there must be something else required which isn't documented on the tutorial page.

  14. Greg Hurrell 2009-04-29T12:16:59Z

    Looks like this could be the answer:

    You can tell webrat to not launch Selenium RC and to instead connect to an already running instance by setting the selenium_server_address configuration variable. It's not well documented (it was originally added to support Selenium Grid) and is perhaps not an ideal solution to the problem you face, but it might be preferable to monkey-patching webrat.

  15. Greg Hurrell 2009-04-29T13:07:08Z

    It works, at least partially...

    Manually start up the Selenium server:

    java -jar /Library/Ruby/Gems/1.8/gems/Selenium-1.1.14/lib/selenium/openqa/selenium-server.jar.txt

    Start up the Mongrel instance on the default port 3000:

    script/server

    Tell Webrat to use the existing Selenium server rather than auto-launching a new one:

    Webrat.configure do |config|
      config.mode = :selenium
      config.application_port = 3000 # 3001 is the default
    
      # connect to already-running Selenium RC instance
      config.selenium_server_address = '127.0.0.1'
      config.selenium_server_port = 4444 # the default
    end

    And run the features:

    cucumber -p selenium

    At the moment they fail and need tweaking, but at least they run.
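    The -p selenium flag presupposes a matching profile in cucumber.yml; the real profile's contents aren't recorded in this issue, but a minimal hypothetical entry might look like:

```yaml
# Hypothetical cucumber.yml entry; the actual options used aren't
# shown in this issue.
selenium: --require features --tags @selenium
```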

  16. Greg Hurrell 2009-11-26T16:52:01Z

    I now have a decent test rig set up for this using Cucumber, Capybara and Culerity.

    The feature "suite" includes a test to make sure that issues like bug #1130 don't crop up again.

    Marking this one as closed.

  17. Greg Hurrell 2009-11-26T16:52:06Z

    Status changed:

    • From: open
    • To: closed