TwistedWeb2-8.1.0/LICENSE

Copyright (c) 2001-2008
Allen Short
Andrew Bennetts
Apple Computer, Inc.
Benjamin Bruheim
Bob Ippolito
Canonical Limited
Christopher Armstrong
David Reid
Donovan Preston
Eric Mangold
Itamar Shtull-Trauring
James Knight
Jason A. Mobarak
Jonathan Lange
Jonathan D. Simms
Jp Calderone
Jürgen Hermann
Kevin Turner
Mary Gardiner
Matthew Lefkowitz
Massachusetts Institute of Technology
Moshe Zadka
Paul Swartz
Pavel Pergamenshchik
Ralph Meijer
Sean Riley
Travis B. Hartwell
Thomas Herve
Eyal Lotem
Antoine Pitrou
Andy Gayton

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

TwistedWeb2-8.1.0/doc/howto/object-traversal.html

Twisted Documentation: Twisted.Web2 Object Traversal

Twisted.Web2 Object Traversal

  1. Object Traversal Basics
  2. locateChild in depth
  3. childFactory method
  4. child_* methods and attributes
  5. Dots in child names
  6. The default trailing slash handler
  7. IRequest.prepath and IRequest.postpath
  8. Conclusion

Object traversal is the process Twisted.Web2 uses to determine what object to use to render HTML for a particular URL. When an HTTP request comes in to the web server, the object publisher splits the URL into segments, and repeatedly calls methods which consume path segments and return objects which represent that path, until all segments have been consumed. At the core, the Web2 traversal API is very simple. However, it provides some higher level functionality layered on top of this to satisfy common use cases.

Object Traversal Basics

The root resource is the top-level object in the URL space; it conceptually represents the URI "/". The Twisted.Web2 object traversal and object publishing machinery uses only two methods to locate an object suitable for publishing and to generate the HTML from it; these methods are described in the interface twisted.web2.iweb.IResource:

class IResource(Interface):
  """
  I am a web resource.
  """

  def locateChild(request, segments):
    """Locate another object which can be adapted to IResource.

    return: A 2-tuple of (resource, remaining-path-segments),
                 or a deferred which will fire the above.
    
                 Causes the object publishing machinery to continue on
                 with specified resource and segments, calling the
                 appropriate method on the specified resource.

                 If you return (self, L{server.StopTraversal}), this
                 instructs web2 to immediately stop the lookup stage,
                 and switch to the rendering stage, leaving the
                 remaining path alone for your render function to
                 handle.
    """

  def renderHTTP(request):
    """Return an IResponse or a deferred which will fire an
    IResponse. This response will be written to the web browser
    which initiated the request.
    """

Let's examine what happens when object traversal occurs over a very simple root resource:

from zope.interface import implements
from twisted.web2 import iweb, http, stream

class SimpleRoot(object):
    implements(iweb.IResource)

    def locateChild(self, request, segments):
        return self, ()

    def renderHTTP(self, request):
        return http.Response(200, stream=stream.MemoryStream("Hello, world!"))

This resource, when passed as the root resource to server.Site or wsgi.createWSGIApplication, will immediately return itself, consuming all path segments. This means that for every URI a user visits on a web server which is serving this root resource, the text "Hello, world!" will be rendered. Let's examine the value of segments for various values of URI:

/foo/bar
  ('foo', 'bar')

/
  ('', )

/foo/bar/baz.html
  ('foo', 'bar', 'baz.html')

/foo/bar/directory/
  ('foo', 'bar', 'directory', '')
    

So we see that Web2 does nothing more than split the URI on the string '/' and pass these path segments to our application for consumption. Armed with these two methods alone, we already have enough information to write applications which service any form of URL imaginable in any way we wish. However, there are some common URL handling patterns which Twisted.Web2 provides higher level support for.

locateChild in depth

One common URL handling pattern involves parents which only know about their direct children. For example, a Directory object may only know about the contents of a single directory, but if it contains other directories, it does not know about their contents. Let's examine a simple Directory object which can provide directory listings and serve up objects for child directories and files:

import os

from twisted.web2 import resource, static, http, stream

class Directory(resource.Resource):
    def __init__(self, directory):
        self.directory = directory
    
    def renderHTTP(self, request):
        html = ['<ul>']
        for child in os.listdir(self.directory):
            fullpath = os.path.join(self.directory, child)
            if os.path.isdir(fullpath):
                child += '/'
            html.extend(['<li><a href="', child, '">', child, '</a></li>'])
            
        html.append('</ul>')
        html = stream.MemoryStream(''.join(html))
        return http.Response(200, stream=html)

    def locateChild(self, request, segments):
        name = segments[0]
        fullpath = os.path.join(self.directory, name)
        if not os.path.exists(fullpath):
            return None, () # 404

        if os.path.isdir(fullpath):
            return Directory(fullpath), segments[1:]
        if os.path.isfile(fullpath):
            return static.File(fullpath), segments[1:]

Because this implementation of locateChild only consumed one segment and returned the rest of them (segments[1:]), the object traversal process will continue by calling locateChild on the returned resource and passing the partially-consumed segments. In this way, a directory structure of any depth can be traversed, and directory listings or file contents can be rendered for any existing directories and files.

So, let us examine what happens when the URI "/foo/bar/baz.html" is traversed, where "foo" and "bar" are directories, and "baz.html" is a file.

  1. Directory('/').locateChild(request, ('foo', 'bar', 'baz.html')) - Returns Directory('/foo'), ('bar', 'baz.html')
  2. Directory('/foo').locateChild(request, ('bar', 'baz.html')) - Returns Directory('/foo/bar'), ('baz.html',)
  3. Directory('/foo/bar').locateChild(request, ('baz.html',)) - Returns File('/foo/bar/baz.html'), ()

There are no more segments to be consumed; File('/foo/bar/baz.html').renderHTTP(request) is called, and the result is sent to the browser.

childFactory method

Consuming one URI segment at a time by checking to see if a requested resource exists and returning a new object is a very common pattern. Web2's default implementation of twisted.web2.iweb.IResource, twisted.web2.resource.Resource, contains an implementation of locateChild which provides more convenient hooks for implementing object traversal. One of these hooks is childFactory. Let us imagine for the sake of example that we wished to render a tree of dictionaries. Our data structure might look something like this:

tree = dict(
    one=dict(
        foo=None,
        bar=None),
    two=dict(
        baz=dict(
            quux=None)))

Given this data structure, the valid URIs would be /, /one, /one/foo, /one/bar, /two, /two/baz and /two/baz/quux.

Let us construct a twisted.web2.resource.Resource subclass which uses the default locateChild implementation and overrides the childFactory hook instead:

from twisted.web2 import http, resource, stream

class DictTree(resource.Resource):
    def __init__(self, dataDict):
        self.dataDict = dataDict

    def renderHTTP(self, request):
        if self.dataDict is None:
            content = "Leaf"
        else:
            html = ['<ul>']
            for key in self.dataDict.keys():
                html.extend(['<li><a href="', key, '">', key, '</a></li>'])
            html.append('</ul>')
            content = ''.join(html)

        return http.Response(200, stream=stream.MemoryStream(content))

    def childFactory(self, request, name):
        if name not in self.dataDict:
            return None # 404
        return DictTree(self.dataDict[name])

As you can see, the childFactory implementation is considerably shorter than the equivalent locateChild implementation would have been.
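For comparison, here is a rough sketch of the same traversal written against locateChild directly (the class name DictTreeManual is made up for illustration; rendering would be identical to DictTree above):

from twisted.web2 import resource

class DictTreeManual(resource.Resource):
    def __init__(self, dataDict):
        self.dataDict = dataDict

    # renderHTTP would be the same as in DictTree above.

    def locateChild(self, request, segments):
        name = segments[0]
        # Leaf nodes and unknown names both produce a 404.
        if self.dataDict is None or name not in self.dataDict:
            return None, ()
        return DictTreeManual(self.dataDict[name]), segments[1:]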

child_* methods and attributes

Often we may wish to have some hardcoded URLs which are not dynamically generated based on some data structure. For example, we might have an application which uses an external CSS stylesheet, an external JavaScript file, and a folder full of images. The twisted.web2.resource.Resource locateChild implementation provides a convenient way for us to express these relationships by using child_-prefixed methods:

from twisted.web2 import resource, http, static, stream

class Linker(resource.Resource):
    def renderHTTP(self, request):
        page = """<html>
    <head>
      <link href="css" rel="stylesheet" />
      <script type="text/javascript" src="scripts" />
    </head>
    <body>
      <img src="images/logo.png" />
    </body>
  </html>"""

        return http.Response(200, stream=stream.MemoryStream(page))

    def child_css(self, request):
        return static.File('/Users/dp/styles.css')

    def child_scripts(self, request):
        return static.File('/Users/dp/scripts.js')

    def child_images(self, request):
        return static.File('/Users/dp/images/')

One thing you may have noticed is that all of the examples so far have returned new object instances whenever they were implementing a traversal API. However, there is no reason these instances cannot be shared. One could, for example, return a global resource instance, an instance which was previously inserted in a dict, or lazily create and cache dynamic resource instances on the fly. The resource.Resource locateChild implementation also provides a convenient way to express that one global resource instance should always be used for a particular URL: the child_-prefixed attribute:

class FasterLinker(Linker):
    child_css = static.File('/Users/dp/styles.css')
    child_scripts = static.File('/Users/dp/scripts.js')
    child_images = static.File('/Users/dp/images/')
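The lazy creation and caching mentioned above also fits naturally into childFactory; a minimal sketch (the class name and the /Users/dp path are only illustrative):

import os

from twisted.web2 import resource, static

class CachingChildren(resource.Resource):
    def __init__(self):
        self._children = {}

    def childFactory(self, request, name):
        # Build each child resource on first request, then reuse the same
        # instance for subsequent requests.
        if name not in self._children:
            fullpath = os.path.join('/Users/dp', name)
            if not os.path.exists(fullpath):
                return None  # 404
            self._children[name] = static.File(fullpath)
        return self._children[name]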

Dots in child names

When a URL contains dots, which is quite common in normal URLs, it is simple enough to handle these URL segments in locateChild or childFactory: one of the passed segments will simply be a string containing a dot. However, it is not immediately obvious how one would express a URL segment with a dot in it when using child_-prefixed methods. The solution is really quite simple:

class DotChildren(resource.Resource):
    def render(self, request):
        return http.Response(200, stream="""<html>
  <head>
    <script type="text/javascript" src="scripts.js" />
  </head>
</html>""")

If you only wish to add a child to a specific instance of DotChildren, then you should use the putChild method.

rsrc = DotChildren()
rsrc.putChild('child_scripts.js', static.File('/Users/dp/scripts.js'))
    

However, if you wish to add a class attribute, you can use setattr, like so:

setattr(DotChildren, 'child_scripts.js', static.File('/Users/dp/scripts.js'))
    

The same technique could be used to install a child method with a dot in the name.
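Continuing the DotChildren example above, a sketch of that approach might look like this (the helper name scriptsChild is made up; "def child_scripts.js" would not be valid Python syntax, which is why setattr is used):

from twisted.web2 import static

def scriptsChild(self, request):
    return static.File('/Users/dp/scripts.js')

# Attach the function to the class under the dotted child_ name.
setattr(DotChildren, 'child_scripts.js', scriptsChild)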

The default trailing slash handler

When a URI which is being handled ends in a slash, such as when the '/' URI is being rendered or when a directory-like URI is being rendered, the string '' appears in the path segments which will be traversed. Again, handling this case is trivial inside either locateChild or childFactory, but it may not be immediately obvious what child_-prefixed method or attribute will be looked up. The method or attribute name which will be used is simply child_ (the prefix followed by the empty segment, leaving a single trailing underscore).

The resource.Resource class provides an implementation of this method which can work in two different ways. If the attribute addSlash is True, the default trailing slash handler will return self. Also, when addSlash is True and a request arrives without the trailing slash, the default resource.Resource.renderHTTP implementation will simply perform a redirect which adds the missing slash to the URL.

The default trailing slash handler also returns self if addSlash is False, but it emits a warning as it does so. This warning may become an exception at some point in the future.

IRequest.prepath and IRequest.postpath

During object traversal, it may be useful to discover which segments have already been handled and which segments are remaining to be handled. In locateChild the remaining segments are given as the second argument. However, since all object traversal APIs are also passed the request object, this information can also be obtained via the IRequest.prepath and IRequest.postpath attributes.
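For example, a minimal resource that reports this bookkeeping might look like the following sketch (the class name is made up):

from twisted.web2 import resource, http

class TraversalInfo(resource.Resource):
    addSlash = True

    def render(self, request):
        # prepath: segments already consumed by earlier locateChild calls.
        # postpath: segments not yet consumed.
        body = "prepath=%r postpath=%r" % (request.prepath, request.postpath)
        return http.Response(200, stream=body)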

Conclusion

Twisted.web2 makes it easy to handle complex URL hierarchies. The most basic object traversal interface, twisted.web2.iweb.IResource.locateChild, provides powerful and flexible control over the entire object traversal process. Web2's canonical IResource implementation, resource.Resource, also includes the convenience hook childFactory, along with child_-prefixed method and attribute semantics, to simplify common use cases.

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/index.html

Twisted Documentation: Twisted.Web2 Documentation

Twisted.Web2 Documentation

  1. Introduction
  2. Deployment
  3. APIs and Design Principles

Introduction

Deployment

APIs and Design Principles

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/headers.html

Twisted Documentation: Twisted.Web2 Headers

Twisted.Web2 Headers

  1. Known headers

Known headers

These headers are defined in RFC 2616; refer to that document for their official definitions. This document describes how they appear after being parsed by web2.
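For instance, here is a small sketch of the difference between the raw and the parsed forms of a header, as seen from a render method (the class name and header choice are only illustrative):

from twisted.web2 import http, http_headers, resource

class ShowContentType(resource.Resource):
    def render(self, request):
        # getRawHeaders returns the header value(s) as sent by the client;
        # getHeader returns the parsed form, e.g. an http_headers.MimeType
        # instance for Content-Type.
        parsed = request.headers.getHeader('content-type')
        raw = request.headers.getRawHeaders('content-type')
        return http.Response(
            200,
            {'content-type': http_headers.MimeType('text', 'plain')},
            "parsed: %r\nraw: %r\n" % (parsed, raw))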

Entity headers:

Request headers (client to server):

Response headers:

General headers:

Lower-level HTTP headers, used only by the framework:

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/intro.html

Twisted Documentation: Twisted.Web2 Introduction

Twisted.Web2 Introduction

  1. What is twisted.web2
  2. What twisted.web2 is not
  3. Introduction
  4. Simple application

What is twisted.web2

Twisted.web2 is an asynchronous HTTP 1.1 server written for the Twisted internet framework. It provides an RFC 2616-compliant HTTP 1.1 protocol implementation, with pipelined and persistent request support, in a non-blocking, threadless manner.

It also includes a simple web framework with request and response objects, static file support, error handling, form upload support, HTTP range support, pre-built parsers for all standard headers, and a bunch of other goodies.

It is deployable as a standalone HTTP or HTTPS server, as an HTTP[S] server proxied behind another server, or as an SCGI, FastCGI, or CGI script.

In addition to running native twisted.web2 applications, it can also run any WSGI or CGI application, or, via compatibility wrappers, most applications written for the older twisted.web API.

Currently, twisted.web2 does not include an HTTP client or proxy, but it will at a future date.

What twisted.web2 is not

Twisted.web2 is not a templating framework. It provides mechanisms for locating and running code associated with a URL, but does not provide any means for separating code and data, or for easing the task of generating HTML.

Twisted.web2 is in general fairly speedy. However, keep in mind that it is a python program, and, while it is empirically "fast enough", it cannot match Apache in static file serving speed. <insert actual measurements here>

Introduction

This tutorial should be readable by people who do not have much Twisted experience yet, but you should know Python and HTML before starting. While it is hopefully redundant to say this, you also ought to have installed twisted and twisted.web2.

When you have finished this tutorial, you should be able to write some simple resources and publish them with twisted.web2.

Simple application

from twisted.web2 import server, http, resource, channel

class Toplevel(resource.Resource):
  addSlash = True
  def render(self, request):
    return http.Response(stream="Hello monkey!")

site = server.Site(Toplevel())

# Standard twisted application Boilerplate
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:8080', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 1: A simple application - ../examples/intro/simple.py

You may run this program via twistd -ny simple.py. twistd is the Twisted runner; it knows how to execute applications by looking for the application variable declared at top-level. You can also run your server in the background (daemonized), via twistd -y simple.py. You can access your server via the url "http://localhost:8080/". For more deployment options, see the deployment chapter.

What this is doing

A resource is responsible for handling one segment of the URL. Here, we have created a resource to handle the top level of the url hierarchy. The addSlash = True setting tells twisted.web2 that this is a directory-like resource. This means that it will add a "/" to the end of the URL automatically, if needed, and respond under that name. Root resources should always have addSlash = True.

The defined class has just a single method: render. This method takes a single argument: request, which contains all the state related to the current rendering operation. This particular render method always returns the same data, so we won't use request. We'll get back to it later.

Here, the render method simply returns a http.Response object containing the output.

After defining this class, next we need to tell twisted.web2 to serve it up. This is accomplished by creating the server.Site object, with an instance of your top-level resource as its argument, and then some standard boilerplate to tell Twisted what services to start and what port to serve them on.

Child resources

Of course, what good is a webserver that can only serve up a single page? So, you can also add child resources to your top-level resource, via child_<name> attributes.

import os.path, time
from twisted.web2 import server, http, resource, channel
from twisted.web2 import static, http_headers, responsecode

class Child(resource.Resource):
  creation_time = time.time()
  text = 'Yo Ho Ho and a bottle of rum.'
  content_type = http_headers.MimeType('text', 'plain')

  def render(self, request):
    return http.Response(
      responsecode.OK,
      {'last-modified': self.creation_time,
      'etag': http_headers.ETag(str(hash(self.text))),
      'content-type': self.content_type},
      self.text)

class Toplevel(resource.Resource):
  addSlash = True
  child_monkey = static.File(os.path.dirname(static.__file__)+'/static.py')
  child_elephant = Child()

  def render(self, request):
    return http.Response(
      200,
      {'content-type': http_headers.MimeType('text', 'html')},
      """<html><body>
      <a href="monkey">The source code of twisted.web2.static</a><br>
      <a href="elephant">A defined child</a></body></html>""")

site = server.Site(Toplevel())

# Standard twisted application Boilerplate
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:8080', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 2: Child Resources - ../examples/intro/children.py

Here a few new concepts have been introduced:

As an aside for those who know a bit about HTTP, note that just by setting the Last-Modified and ETag response headers, you enable automatic precondition checks which support the If-Modified-Since, If-Unmodified-Since, If-Match, and If-None-Match input headers. This allows the client to request that the resource only be sent if it has changed since the last time the client downloaded it, saving bandwidth. Also, the Range and If-Range headers are supported on every resource, allowing partial downloads, and default values for the "Server" and "Date" headers are added to the output for you automatically.

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/tap-deploy.html

Twisted Documentation: Twisted.Web2 Deployment with mktap

Twisted.Web2 Deployment with mktap

  1. Simple Servers
  2. Virtual Hosts
  3. Conclusion

While Twisted.Web2 can be deployed in a variety of flexible and complicated ways, occasionally a simpler approach is desired. For this, Web2 makes use of TAPs created by the mktap command-line utility. This document outlines a few of the approaches for creating, configuring, and deploying a Twisted.Web2 server with mktap.

Since the Web2 mktap plugin is a work in progress, it is suggested that you refer to the output of the following command for further information:

% mktap web2 --help
    

Simple Servers

Static Files

Perhaps the simplest possible Twisted.Web2 configuration is to serve a bunch of static files from a directory.

% mktap web2 --path /some/path
    

In case you've forgotten mktap 101: this will create a file in the current directory called web2.tap. You can then launch this server configuration with the following command:

% twistd -nf web2
2006/03/02 00:29 PST [-] Log opened.
2006/03/02 00:29 PST [-] twistd SVN-Trunk (/usr/bin/python 2.4.2) starting up
2006/03/02 00:29 PST [-] reactor class: twisted.internet.selectreactor.SelectReactor
2006/03/02 00:29 PST [-] Loading web2.tap...
2006/03/02 00:29 PST [-] Loaded.
2006/03/02 00:29 PST [-] twisted.web2.channel.http.HTTPFactory starting on 8080
2006/03/02 00:29 PST [-] Starting factory <twisted.web2.channel.http.HTTPFactory instance at 0x7787ee4c>
    

You now have an HTTP server serving static files on port 8080, and if you open it in a web browser, you'll see something like this in your terminal:

2006/03/02 00:29 PST [HTTPChannel,0,127.0.0.1] 127.0.0.1 - - [02/Mar/2006:00:29:14 -0700] "GET / HTTP/1.1" 200 2577 "-" "Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.1) Gecko/20060224 Ubuntu/dapper Firefox/1.5.0.1"
    

By default the TAP plugin logs to the standard twistd logfile. However, if you specify the --logfile option, twistd will log to the specified file in the Common Access Logging format.

% mktap web2 --path /some/path --logfile ./access.log
    

Dynamic Resources

Twisted.Web2's TAP plugin is also perfectly capable of serving up a dynamic Resource or two. In the name of completeness, here is a simple resource.

from twisted.web2 import http, resource

class HelloWorld(resource.Resource):
    def render(self, req):
        return http.Response(200, stream="Hello, World!")
Listing 1: Hello World - ../examples/hello.py

To use it from mktap you simply have to make sure it's in your PYTHONPATH and tell mktap to use it as its root resource.

% mktap web2 --class=hello.HelloWorld
    

It's important to keep in mind that this class will be created with no arguments passed to the constructor.
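In practice this means any configuration has to live inside the class itself; a minimal sketch (the class name and path below are made up):

from twisted.web2 import resource, static, http

class MyRoot(resource.Resource):
    addSlash = True
    # Configuration lives on the class, since mktap instantiates MyRoot()
    # with no constructor arguments.
    child_static = static.File('/var/www/static')

    def render(self, request):
        return http.Response(200, stream="Configured with no constructor arguments")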

Virtual Hosts

Now for something a little different, and a little more advanced. The TAP plugin supports several ways of configuring a server that uses Named Virtual Hosting.

Just a bunch of directories

The first method of serving virtual hosts involves a bunch of directories, each of which corresponds to the root directory of one virtual host.

For example:

% ls servers
test.example.com
foo.example.com
mail.example.com
% mktap web2 --vhost-path servers/
    

Each of the directories under servers is served out as a static.File when you visit the appropriate URL. For example, http://test.example.com:8080/ will give you the contents of servers/test.example.com (assuming test.example.com points to the actual place where the server is running).

Adding a Single Virtual Host

You can also add a single virtual host at a time, either in a separate directory structure with --vhost-static or as a dynamic resource with --vhost-class. You can use as many of these arguments as you wish, even combining them with --vhost-path.

For example, the following command will give us a web2.tap that serves two virtual hosts: images.example.com, and example.com, which will serve our dynamic application (Hello World).

% mktap web2 --vhost-static=images.example.com=images/ --vhost-class=example.com=hello.HelloWorld
    

Conclusion

Web2's TAP plugin is a great way to get a server started and start playing around. However, there are many other ways to deploy web2, and the TAP plugin is meant to be a stepping stone to more advanced techniques such as those mentioned in the deployment howto.

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/authentication.html

Twisted Documentation: HTTP Authentication with Twisted.Web2

HTTP Authentication with Twisted.Web2

  1. Overview
  2. Cred
  3. Credential Factories
  4. The HTTPAuthResource

Overview

twisted.web2.auth implements Digest and Basic HTTP Authentication as specified by RFC 2617. This document attempts to describe:

Cred

twisted.cred is a pluggable authentication framework which allows application/protocol developers to easily support multiple authentication types regardless of the backend used. This document assumes some familiarity with cred and suggests you read Cred: Pluggable Authentication and the twisted.cred API Reference for further information. However, several of the application-specific implementations of objects required by cred are listed below.

from zope.interface import Interface, implements
from twisted.cred import portal
from twisted.web2.auth.interfaces import IHTTPUser

class HTTPUser(object):
    """
    A user that authenticated over HTTP Auth.
    """
    implements(IHTTPUser)

    username = None

    def __init__(self, username):
        """
        @param username: The str username sent as part of the HTTP auth
            response.
        """
        self.username = username


class HTTPAuthRealm(object):
    implements(portal.IRealm)

    def requestAvatar(self, avatarId, mind, *interfaces):
        if IHTTPUser in interfaces:
            return IHTTPUser, HTTPUser(avatarId)

        raise NotImplementedError("Only IHTTPUser interface is supported")
Listing 1: Cred Setup - ../examples/auth/credsetup.py

Credential Factories

Credential Factories as defined by ICredentialFactory are the heart of HTTP Authentication. Their functions are two-fold:

  1. They provide the challenges and decode the responses from the client, while maintaining state for stateful authentication schemes such as Digest.
  2. They are used to define and determine which authentication schemes should be used during authentication.

The ICredentialFactory interface defines the following:
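The authoritative definition lives in twisted.web2.auth.interfaces. As a rough sketch only (this is an assumption-laden toy factory for the basic scheme, not web2's actual BasicCredentialFactory), an implementation has approximately this shape:

from zope.interface import implements
from twisted.cred import credentials, error
from twisted.web2.auth.interfaces import ICredentialFactory

class NaiveBasicFactory(object):
    implements(ICredentialFactory)

    scheme = 'basic'  # the auth-scheme token used in WWW-Authenticate

    def __init__(self, realm):
        self.realm = realm

    def getChallenge(self, peer):
        # The returned dict is rendered into the WWW-Authenticate header.
        return {'realm': self.realm}

    def decode(self, response, request):
        # Turn the client's response into an ICredentials for the portal.
        try:
            user, password = response.decode('base64').split(':', 1)
        except Exception:
            raise error.LoginFailed('Invalid credentials')
        return credentials.UsernamePassword(user, password)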

The HTTPAuthResource

The purpose of HTTPAuthResource is to trap both locateChild and renderHTTP and require authentication before allowing requests to pass on to its wrappedResource. It does this by returning an UnauthorizedResource if the following conditions are not met:

Usage By Example

from twisted.web2 import channel, resource, http, responsecode, server
from twisted.web2.auth.interfaces import IAuthenticatedRequest, IHTTPUser

class ProtectedResource(resource.Resource):
    """
    A resource that is protected by HTTP Auth
    """
    addSlash = True

    def render(self, req):
        """
        I adapt C{req} to an L{IAuthenticatedRequest} before using the
        avatar to return a personalized message.
        """
        avatar = IAuthenticatedRequest(req).avatar

        return http.Response(
            responsecode.OK,
            stream=("Hello %s, you've successfully accessed "
                    "a protected resource." % (avatar.username,)))

from twisted.web2.auth import digest, basic, wrapper

from twisted.cred.portal import Portal
from twisted.cred import checkers

import credsetup

#
# Create the portal with our realm that knows about the kind of avatar
# we want.
#

portal = Portal(credsetup.HTTPAuthRealm())

#
# Create a checker that knows about the type of backend we want to use
# and that knows about the ICredentials we get back from our
# ICredentialFactories.  And tell our portal to use it.
#

checker = checkers.InMemoryUsernamePasswordDatabaseDontUse(guest='guest123')

portal.registerChecker(checker)

#
# Set up our HTTPAuthResource. We have to tell it the root of the resource
# hierarchy we want to protect, as well as the credential factories we want
# to support, the portal we want to use for logging in, and the interfaces
# that IAuthenticatedRequest.avatar may implement.
#

root = wrapper.HTTPAuthResource(ProtectedResource(),
                                (basic.BasicCredentialFactory('My Realm'),
                                 digest.DigestCredentialFactory('md5',
                                                               'My Realm')),
                                portal, (IHTTPUser,))

site = server.Site(root)

# Start up the server
from twisted.application import service, strports
application = service.Application("HTTP Auth Demo")
s = strports.service('tcp:8080', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 2: Working HTTPAuthResource Example - ../examples/auth/httpauth.tac

This simple example consists of the following application-specific components.

  1. A Resource we wish to protect from unauthorized access, in this case it is our ProtectedResource
  2. A portal using our realm from Listing 1 and having a single ICredentialChecker, in this case a simple checker that stores usernames and passwords in memory and should not be used for anything other than as an example.
  3. A single ICredentialFactory, in this case a DigestCredentialFactory using the md5 algorithm and with a realm of "My Realm"
  4. A sequence of avatar interfaces consisting of our IHTTPUser as defined in Listing 1

Things HTTPAuthResource doesn't do

HTTPAuthResource is provided largely as a lowest common denominator authentication solution. As a result, it has a few limitations:

As a result of these limitations HTTPAuthResource is provided more as an example of how you can work with twisted.web2.auth rather than as a definitive solution.

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/deployment.html

Twisted Documentation: Twisted.web2 Deployment

Twisted.web2 Deployment

  1. Standalone HTTP
  2. HTTP behind Apache2
  3. HTTP behind Apache1
  4. SCGI
  5. FastCGI
  6. CGI

There are a number of possibilities for deploying twisted.web2: as a standalone HTTP[S] server, proxied over HTTP behind another server, or as SCGI, FastCGI, or CGI.

Deploying as a standalone HTTP/HTTPS server is by far the simplest. Unless you have a reason not to, it is recommended that you choose this option. However, many people already run web servers on their computer and are not willing or able to completely blow it away and replace it with twisted.web2. The next best option is to run twisted.web2 as a server proxied behind your existing webserver, using either HTTP or SCGI.

Standalone HTTP

For completeness, here is a simple standalone HTTP server again.

from twisted.web2 import server, channel, static

# For example, serve the /tmp directory
toplevel = static.File("/tmp")
site = server.Site(toplevel)

# Start up the server
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:8080', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 1: A standalone HTTP server - ../examples/deployment/standalone.tac

HTTP behind Apache2

If you use HTTP proxying, you must inform twisted.web2 of the real URL it is being accessed by, or else any URLs it generates will be incorrect. You can do this via the AutoVHostURIRewrite resource when using apache2 as the main server.

On the apache side, configure as follows. Apache automatically sends the original host in the X-Forwarded-Host header, and the original remote IP address in the X-Forwarded-For header. You must additionally send along the original path, and the original scheme.

For proxying a subdirectory:

<Location /whatever/>
ProxyPass http://localhost:8538/
RequestHeader set X-App-Location /whatever/
RequestHeader set X-App-Scheme http

</Location>
    

Or, for serving an entire HTTPS virtual host:

<VirtualHost myip:443>
ServerName example.com
ProxyPass / http://localhost:8538/
RequestHeader set X-App-Location /
RequestHeader set X-App-Scheme https
</VirtualHost>
    

Now, on the twisted.web2 side:

from twisted.web2 import server, channel, static, vhost

# For example, serve the /tmp directory
toplevel = static.File("/tmp")
# Use the automatic uri rewriting based on apache2 headers
toplevel = vhost.AutoVHostURIRewrite(toplevel)
site = server.Site(toplevel)

# Start up the server
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:8538', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 2: Behind Apache 2 - ../examples/deployment/apache2.tac

HTTP behind Apache1

Apache 1 doesn't provide the X-Forwarded-Host or X-Forwarded-For headers, or the ability to set custom headers in the outgoing proxy request. Therefore, you must provide that information to twisted.web2 directly. This is accomplished by the VHostURIRewrite resource.

Set up apache as follows:

<VirtualHost myip>
ServerName example.com
ProxyPass /foo/ http://localhost:8538/
</VirtualHost>
    

And Twisted like so:

from twisted.web2 import server, channel, static, vhost

# For example, serve the /tmp directory
toplevel = static.File("/tmp")
# Add the rewriter.
toplevel = vhost.VHostURIRewrite("http://myhostname.com/foo/", toplevel)
site = server.Site(toplevel)

# Start up the server
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:8538:interface=127.0.0.1', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 3: Behind Apache 1 - ../examples/deployment/apache1.tac

Because vhost.VHostURIRewrite can exist anywhere in the resource tree, you can have multiple applications running on a single twisted port by making them siblings of a root resource and referencing their full path in the ProxyPass directive.

Set up apache as follows:

<VirtualHost foo.myhostname.com>
ProxyPass / http://localhost:8538/foo/
ServerName example.com
</VirtualHost>

<VirtualHost bar.myhostname.com>
ProxyPass / http://localhost:8538/bar/
ServerName example.com
</VirtualHost>
    

And Twisted like so:

from twisted.web2 import server, channel, resource, static, vhost

# For example, serve the /tmp/foo directory
foo_toplevel = static.File("/tmp/foo")
# And the /tmp/bar directory
bar_toplevel = static.File("/tmp/bar")
# Add the rewriters:
foo_toplevel = vhost.VHostURIRewrite("http://foo.myhostname.com/",
      foo_toplevel)
bar_toplevel = vhost.VHostURIRewrite("http://bar.myhostname.com/",
      bar_toplevel)

toplevel = resource.Resource()
toplevel.putChild('foo', foo_toplevel)
toplevel.putChild('bar', bar_toplevel)
site = server.Site(toplevel)

# Start up the server
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:8538:interface=127.0.0.1', channel.HTTPFactory(site))
s.setServiceParent(application)
Listing 4: Multiple hosts behind Apache 1 - ../examples/deployment/apache1_twohosts.tac

SCGI

SCGI is an alternative to HTTP proxying, and can be used in its place with any front-end server that supports it. Additionally, if all you have access to from the web server is CGI, but you are able to run long-running processes, you can use the cgi2scgi C program to channel CGI requests to your twisted.web2 SCGI port. This won't be as efficient as mod_scgi or HTTP proxying, but it will be much better than using twisted directly as a CGI.

FIXME: Someone who has installed mod_scgi in apache should write a bit on it.

Configure Twisted as follows:

from twisted.web2 import server, channel, static

# For example, serve the /tmp directory
toplevel = static.File("/tmp")
site = server.Site(toplevel)

# Start up the server
from twisted.application import service, strports
application = service.Application("demoserver")
s = strports.service('tcp:3000', channel.SCGIFactory(site))
s.setServiceParent(application)
Listing 5: An SCGI Server - ../examples/deployment/scgi.tac

FastCGI

FastCGI is another popular way to run a web application. Blah blah.

CGI

CGI is the worst possible deployment environment, yet in some cases it may be all that is possible. It allows only a single request to be served from a process, so any kind of in-memory storage is impossible. Also, the overhead of starting up a new python interpreter for every request can get quite high. You should only consider using it if your hosting provider does not allow you to keep a process running.

However, if it's your only choice, you can deploy a twisted.web2 app using it. Unlike the other examples, where we create a .tac file for running with twistd, in this case a standalone Python script is necessary:

#!/usr/bin/env python
from twisted.web2 import channel, server, static
toplevel = static.File("/tmp")
site = server.Site(toplevel)
channel.startCGI(site)

Index

Version: 8.1.0

TwistedWeb2-8.1.0/doc/howto/resource-apis.html

Twisted Documentation: Resource, Request, and Response

Resource, Request, and Response

  1. Resources
  2. Response
  3. Request

The three main APIs you will have to be concerned with, as a normal user of the framework, are the Resource, Request, and Response objects.

Resources

The core resource API is described by twisted.web2.iweb.IResource

A resource (twisted.web2.resource.Resource) will generally be the superclass of the classes you define. As you saw in the intro document, it supports two operations: rendering and locating a child resource. This is described in more detail in the object traversal document.

Response

The response object (twisted.web2.http.Response) contains the state which will be sent back to the client. You construct one as follows:

Response(code=None, headers=None, stream=None)

The arguments, in detail, are as follows (a short example putting them together appears after this list):

  1. Response code. This should be one of the standard HTTP response codes, either as defined in the twisted.web2.responsecode module, or, equivalently, just an integer. If left unspecified, the default is responsecode.OK, or 200.
  2. Headers. The headers, as stored in the response object, are an instance of twisted.web2.http_headers.Headers. For convenience, you may also simply pass a dictionary of names to values, which will automatically be turned into the Headers instance for you. Please note that the values used here are not the direct string representations that will be sent to the client, but rather, an already-parsed representation. This is to centralize the tricky business of parsing HTTP headers correctly, and to ensure that issues like quoting are taken care of automatically. See Headers for details about the parsed representation for each header. If left unspecified, only the default headers added by the core are output.
  3. The output stream. At the simplest level, you can simply pass a string for this argument, and the string will be output. However, underlying this is a much more powerful system which allows for the efficient streaming output of arbitrarily large data from a file or other sources, such as a CGI process or an outgoing HTTP request to another server. This is accomplished by providing an implementor of twisted.web2.stream.IByteStream. For more detail on streams, see the twisted.web2.stream module.

Request

The request object holds all the data regarding this particular incoming connection from the client. There are two request objects in web2: the core http request in twisted.web2.http.Request, and the application server subclass of that in twisted.web2.server.Request. The second is the one you will be using, and that is described here. The first is a subset thereof that is only interesting to someone wanting to replace the application server portion of twisted.web2.

  1. method - Request method. This is the HTTP method, e.g. "GET" or "HEAD" or "POST".
  2. headers - A twisted.web2.http_headers.Headers instance.
  3. stream - The incoming data stream, an implementor of twisted.web2.stream.IByteStream
  4. remoteAddr - The address of the remote host, a twisted.internet.interfaces.IAddress.

Then there are the attributes that make up a URL (a short example follows the list). Note that all of these, including scheme, host, and port, may be specified by the client:

  1. scheme - the request scheme the user used, e.g. "http" or "https".
  2. host - the hostname the client sent the request to, e.g. "localhost"
  3. port - the port the client sent the request to, e.g. "80"
  4. path - the complete path, as a string
  5. params - The url "parameters". This is an obscure part of the url spec that you're unlikely to ever have a use for.
  6. querystring - The query arguments as a string.
  7. args - The parsed form arguments, as a dictionary, including POST arguments if applicable. This is in the form of a dictionary of lists. The string "?a=1&c=1&c=2" will get turned into {'a':['1'], 'c':['1', '2']}.
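For instance, here is a small sketch that reports these pieces (the class name is made up; the values in the comment assume a request for http://localhost:8080/foo/bar?a=1&c=1&c=2):

from twisted.web2 import resource, http

class ShowURL(resource.Resource):
    addSlash = True

    def render(self, request):
        # For the example URL above this would report scheme='http',
        # host='localhost', port=8080, path='/foo/bar',
        # querystring='a=1&c=1&c=2', args={'a': ['1'], 'c': ['1', '2']}.
        body = "%s://%s:%s%s?%s args=%r" % (
            request.scheme, request.host, request.port,
            request.path, request.querystring, request.args)
        return http.Response(200, stream=body)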

Then, the pieces of the parsed URL as it is being traversed:

  1. prepath - A list of url segments that have already been processed by a locateChild method.
  2. postpath - A list of url segments yet to be processed.

Index

Version: 8.1.0TwistedWeb2-8.1.0/doc/examples/0000755000175000017500000000000011014056244014715 5ustar dokodokoTwistedWeb2-8.1.0/doc/examples/deployment/0000755000175000017500000000000011014056216017074 5ustar dokodokoTwistedWeb2-8.1.0/doc/examples/deployment/apache1_twohosts.tac0000644000175000017500000000142310376767501023061 0ustar dokodokofrom twisted.web2 import server, channel, resource, static, vhost # For example, server the /tmp/foo directory foo_toplevel = static.File("/tmp/foo") # And the /tmp/bar directory bar_toplevel = static.File("/tmp/bar") # Add the rewriters: foo_toplevel = vhost.VHostURIRewrite("http://foo.myhostname.com/", foo_toplevel) bar_toplevel = vhost.VHostURIRewrite("http://bar.myhostname.com/", bar_toplevel) toplevel = resource.Resource() toplevel.putChild('foo', foo_toplevel) toplevel.putChild('bar', bar_toplevel) site = server.Site(toplevel) # Start up the server from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:8538:interface=127.0.0.1', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/deployment/apache1.tac0000644000175000017500000000074310376767501021113 0ustar dokodokofrom twisted.web2 import server, channel, static, vhost # For example, serve the /tmp directory toplevel = static.File("/tmp") # Add the rewriter. toplevel = vhost.VHostURIRewrite("http://myhostname.com/foo/", toplevel) site = server.Site(toplevel) # Start up the server from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:8538:interface=127.0.0.1', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/deployment/apache2.tac0000644000175000017500000000073410376767501021114 0ustar dokodokofrom twisted.web2 import server, channel, static, vhost # For example, serve the /tmp directory toplevel = static.File("/tmp") # Use the automatic uri rewriting based on apache2 headers toplevel = vhost.AutoVHostURIRewrite(toplevel) site = server.Site(toplevel) # Start up the server from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:8538', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/deployment/scgi.tac0000644000175000017500000000055310376767501020535 0ustar dokodokofrom twisted.web2 import server, channel, static # For example, serve the /tmp directory toplevel = static.File("/tmp") site = server.Site(toplevel) # Start up the server from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:3000', channel.SCGIFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/deployment/standalone.tac0000644000175000017500000000055310376767501021740 0ustar dokodokofrom twisted.web2 import server, channel, static # For example, serve the /tmp directory toplevel = static.File("/tmp") site = server.Site(toplevel) # Start up the server from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:8080', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/auth/0000755000175000017500000000000011014056216015655 5ustar dokodokoTwistedWeb2-8.1.0/doc/examples/auth/credsetup.py0000644000175000017500000000133610563450446020243 0ustar dokodokofrom zope.interface import Interface, implements from twisted.cred import 
portal from twisted.web2.auth.interfaces import IHTTPUser class HTTPUser(object): """ A user that authenticated over HTTP Auth. """ implements(IHTTPUser) username = None def __init__(self, username): """ @param username: The str username sent as part of the HTTP auth response. """ self.username = username class HTTPAuthRealm(object): implements(portal.IRealm) def requestAvatar(self, avatarId, mind, *interfaces): if IHTTPUser in interfaces: return IHTTPUser, HTTPUser(avatarId) raise NotImplementedError("Only IHTTPUser interface is supported") TwistedWeb2-8.1.0/doc/examples/auth/httpauth.tac0000644000175000017500000000411610563450446020224 0ustar dokodokofrom twisted.web2 import channel, resource, http, responsecode, server from twisted.web2.auth.interfaces import IAuthenticatedRequest, IHTTPUser class ProtectedResource(resource.Resource): """ A resource that is protected by HTTP Auth """ addSlash = True def render(self, req): """ I adapt C{req} to an L{IAuthenticatedRequest} before using the avatar to return a personalized message. """ avatar = IAuthenticatedRequest(req).avatar return http.Response( responsecode.OK, stream=("Hello %s, you've successfully accessed " "a protected resource." % (avatar.username,))) from twisted.web2.auth import digest, basic, wrapper from twisted.cred.portal import Portal from twisted.cred import checkers import credsetup # # Create the portal with our realm that knows about the kind of avatar # we want. # portal = Portal(credsetup.HTTPAuthRealm()) # # Create a checker that knows about the type of backend we want to use # and that knows about the ICredentials we get back from our # ICredentialFactories. And tell our portal to use it. # checker = checkers.InMemoryUsernamePasswordDatabaseDontUse(guest='guest123') portal.registerChecker(checker) # # Set up our HTTPAuthResource, we have to tell it the root of the resource # heirarchy we want to protect, as well as the credential factories we want # to support, the portal we want to use for logging in, and the interfaces # that IAuthenticatedRequest.avatar to may implement. # root = wrapper.HTTPAuthResource(ProtectedResource(), (basic.BasicCredentialFactory('My Realm'), digest.DigestCredentialFactory('md5', 'My Realm')), portal, (IHTTPUser,)) site = server.Site(root) # Start up the server from twisted.application import service, strports application = service.Application("HTTP Auth Demo") s = strports.service('tcp:8080', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/intro/0000755000175000017500000000000011014056216016047 5ustar dokodokoTwistedWeb2-8.1.0/doc/examples/intro/children.py0000644000175000017500000000232410376767501020232 0ustar dokodokoimport os.path, time from twisted.web2 import server, http, resource, channel from twisted.web2 import static, http_headers, responsecode class Child(resource.Resource): creation_time = time.time() text = 'Yo Ho Ho and a bottle of rum.' content_type = http_headers.MimeType('text', 'plain') def render(self, ctx): return http.Response( responsecode.OK, {'last-modified': self.creation_time, 'etag': http_headers.ETag(str(hash(self.text))), 'content-type': self.content_type}, self.text) class Toplevel(resource.Resource): addSlash = True child_monkey = static.File(os.path.dirname(static.__file__)+'/static.py') child_elephant = Child() def render(self, ctx): return http.Response( 200, {'content-type': http_headers.MimeType('text', 'html')}, """ The source code of twisted.web2.static
A defined child""") site = server.Site(Toplevel()) # Standard twisted application Boilerplate from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:8080', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/intro/simple.py0000644000175000017500000000070010376767501017727 0ustar dokodokofrom twisted.web2 import server, http, resource, channel class Toplevel(resource.Resource): addSlash = True def render(self, ctx): return http.Response(stream="Hello monkey!") site = server.Site(Toplevel()) # Standard twisted application Boilerplate from twisted.application import service, strports application = service.Application("demoserver") s = strports.service('tcp:8080', channel.HTTPFactory(site)) s.setServiceParent(application) TwistedWeb2-8.1.0/doc/examples/hello.py0000644000175000017500000000024310410353354016372 0ustar dokodokofrom twisted.web2 import http, resource class HelloWorld(resource.Resource): def render(self, req): return http.Response(200, stream="Hello, World!") TwistedWeb2-8.1.0/doc/examples/index.html0000644000175000017500000000413511014056244016715 0ustar dokodokoTwisted Documentation: Twisted.Web2 code examples

Twisted.Web2 code examples

  1. Demo
  2. Introduction
  3. Deployment
  4. Authentication

Demo

Introduction

Deployment

Authentication

Index

Version: 8.1.0TwistedWeb2-8.1.0/doc/examples/demo.py0000644000175000017500000001302210613122074016210 0ustar dokodoko#!/usr/bin/env python # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """I am a simple test resource. """ import os.path import cgi as pycgi from twisted.web2 import log from twisted.web2 import static, wsgi, resource, responsecode, twcgi from twisted.web2 import stream, http, http_headers from twisted.internet import reactor ### A demo WSGI application. def simple_wsgi_app(environ, start_response): status = '200 OK' response_headers = [('Content-type','text/html; charset=ISO-8859-1')] start_response(status, response_headers) data = environ['wsgi.input'].read() environ['wsgi.errors'].write("This is an example wsgi error message\n") s = '
'
    items=environ.items()
    items.sort()
    for k,v in items:
        s += repr(k)+': '+repr(v)+'\n'
    return [s, '

\nData:


', data, '
'] ### Demonstrate a simple resource which renders slowly. class Sleepy(resource.Resource): def render(self, req): # Create a stream object which can be written in pieces. s=stream.ProducerStream() # Write a string, and then, later, write another string, and # call it done. (Also write spaces so browsers don't wait # before displaying anything at all) s.write("Hello\n") s.write(' '*10000+'\n') reactor.callLater(1, s.write, "World!\n") reactor.callLater(2, s.finish) # Return a response. Use the default response code of OK, and # the default headers return http.Response(stream=s) ### Form posting class FormPost(resource.PostableResource): def render(self, req): return http.Response(responsecode.OK, {'content-type': http_headers.MimeType('text', 'html')}, """ Form1, x-www-form-urlencoded:

Form2, multipart/form-data:

Arg dict: %r, Files: %r""" % (req.args, req.files)) ### Toplevel resource. This is a more normal resource. class Toplevel(resource.Resource): # addSlash=True to make sure it's treated as a directory-like resource addSlash=True # Render the resource. Here the stream is a string, which will get # adapted to a MemoryStream object. def render(self, req): contents = """ Twisted.web2 demo server Hello! This is a twisted.web2 demo.

""" return http.Response( responsecode.OK, {'content-type': http_headers.MimeType('text', 'html')}, contents) # Add some child resources child_file = static.File(os.path.join(os.path.dirname(resource.__file__), 'TODO')) child_dir = static.File('.') child_sleepy = Sleepy() child_wsgi = wsgi.WSGIResource(simple_wsgi_app) child_cgi = twcgi.FilteredScript(pycgi.__file__, filters=["/usr/bin/python"]) child_forms = FormPost() ######## Demonstrate a bunch of different deployment options ######## ### You likely only want one of these for your app. # This part gets run when you run this file via: "twistd -noy demo.py" if __name__ == '__builtin__': from twisted.application import service, strports from twisted.web2 import server, vhost, channel from twisted.python import util # Create the resource we will be serving test = Toplevel() # Setup default common access logging res = log.LogWrapperResource(test) log.DefaultCommonAccessLoggingObserver().start() # Create the site and application objects site = server.Site(res) application = service.Application("demo") # Serve it via standard HTTP on port 8080 s = strports.service('tcp:8080', channel.HTTPFactory(site)) s.setServiceParent(application) # Serve it via HTTPs on port 8081 certPath = util.sibpath(__file__, os.path.join("..", "..", "core", "examples", "server.pem")) s = strports.service('ssl:8081:privateKey=%s' % certPath, channel.HTTPFactory(site)) s.setServiceParent(application) # Serve it via SCGI on port 3000 s = strports.service('tcp:3000', channel.SCGIFactory(site)) s.setServiceParent(application) # Serve it via FastCGI on port 3001 s = strports.service('tcp:3001', channel.FastCGIFactory(site)) s.setServiceParent(application) # Serve it via HTTP on port 8538, with a url rewriter for running behind apache1. # (See deployment documentation for apache setup) s = strports.service( 'tcp:8538:interface=127.0.0.1', channel.HTTPFactory(server.Site( vhost.VHostURIRewrite('http://localhost/app/', test)))) s.setServiceParent(application) # This bit gets run when you run this script as a CGI from another webserver. if __name__ == '__main__': from twisted.web2 import channel, server toplevel = Toplevel() channel.startCGI(server.Site(toplevel)) TwistedWeb2-8.1.0/setup.py0000644000175000017500000000145010431431321014037 0ustar dokodokoimport sys try: from twisted.python import dist except ImportError: raise SystemExit("twisted.python.dist module not found. 
Make sure you " "have installed the Twisted core package before " "attempting to install any other Twisted projects.") if __name__ == '__main__': dist.setup( twisted_subproject="web2", # metadata name="Twisted Web2", description="Twisted Web2 is a web server.", author="Twisted Matrix Laboratories", author_email="twisted-python@twistedmatrix.com", maintainer="James Knight", maintainer_email="foom@fuhm.net", url="http://twistedmatrix.com/trac/wiki/TwistedWeb2", license="MIT", long_description="Twisted Web2 is a web server.", ) TwistedWeb2-8.1.0/NEWS0000644000175000017500000000354211014050361013027 0ustar dokodoko8.1.0 (2008-05-18) ================== Fixes ----- - The deprecated mktap API is no longer used (#3127) 8.0.1 (2008-03-26) ================== Features -------- - The HTTP authentication support has been improved (#2042, #2104, #2460) - PostableResource now allows greater configurability (#2836) Fixes ----- - static.FileSaver no longer incorrectly saves files with '\r\n' on Windows (#1979) - SCRIPT_NAME handling in CGI scripts is no longer inappropriately set (#2075) - A bug in which http_ methods were called with invalid arguments when checkPrecondition returned a Deferred has been fixed (#3084) Deprecations and Removals ------------------------- - web2.dav has been removed (#3072) Misc ---- - #2102, #1906, #2640, #2612, #1042, #2304 0.2.0 (June 9, 2006) ==================== Features -------- - WebDAV Level 1 support. - HTTP-AUTH Basic and Digest Authentication. - Low-Level HTTP Client implementation. - A mktap plugin for a quick start at setting up a Web2 Server. - XML-RPC Server-side support. Bugfixes -------- - AutoVHostURIRewrite improvements - Proxy Compatibility - #1700 - Now exposes the client IP - #1699 - Proper WWW-Authenticate header parsing - #1723 - Fixed exception when using SCGIClientResource with logging. - #1755 0.1.0 (June 16, 2005) ===================== Major Features -------------- - Streaming upload data - Support for multiple headers of the same name - Separation of low level HTTP and high level request handling which allows it to run under other transports such as SCGI and CGI - IResource API improvements from Nevow - More versatile outgoing data streaming API - Correct header parsing - Full HTTP/1.1 support - Output filters (HTTP range support in a generic fashion, and gzip support) - Significantly better URI Rewriting when used behind a reverse proxy such as Apache's mod_proxy TwistedWeb2-8.1.0/README0000644000175000017500000000231310254433247013220 0ustar dokodokoTwisted.Web2 ============ Twisted.Web2 is the next generation Web Server Framework built with Twisted. Web2 is under active development and it's APIs should not be considered stable at this point. It is not a version of Twisted.Web and with that in mind compatibility is not of the highest concern, though the compatibility layer does support many but not all twisted.web resources. 
Improvements over Twisted.Web ----------------------------- * Streaming upload data * Support for multiple headers of the same name * Separation of low level HTTP and high level request handling which allows it to run under other transports such as SCGI and CGI * IResource API improvements from Nevow * More versatile outgoing data streaming API * Correct header parsing * Full HTTP/1.1 support * Output filters (HTTP range support in a generic fashion, and gzip support) * Significantly better URI Rewriting when used behind a reverse proxy such as Apache's mod_pxy Things to further improve ------------------------- * Speed, it's not very fast, depending on how the benchmarks are done it's either significantly faster than Twisted.Web or twice as slow. * Better twisted.web compatibility * More and better tests TwistedWeb2-8.1.0/twisted/0000755000175000017500000000000011017352661014022 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/0000755000175000017500000000000011017352661014661 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/test/0000755000175000017500000000000011014056216015632 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/test/test_http_headers.py0000644000175000017500000007175010437435131021734 0ustar dokodokofrom twisted.trial import unittest import random, time from twisted.web2 import http_headers from twisted.web2.http_headers import Cookie, HeaderHandler from twisted.python import util class parsedvalue: """Marker class""" def __init__(self, raw): self.raw = raw def __eq__(self, other): return isinstance(other, parsedvalue) and other.raw == self.raw class HeadersAPITest(unittest.TestCase): """Make sure the public API exists and works.""" def testRaw(self): rawvalue = ("value1", "value2") h = http_headers.Headers(handler=HeaderHandler(parsers={}, generators={})) h.setRawHeaders("test", rawvalue) self.assertEquals(h.hasHeader("test"), True) self.assertEquals(h.getRawHeaders("test"), rawvalue) self.assertEquals(list(h.getAllRawHeaders()), [('Test', rawvalue)]) self.assertEquals(h.getRawHeaders("foobar"), None) h.removeHeader("test") self.assertEquals(h.getRawHeaders("test"), None) def testParsed(self): parsed = parsedvalue(("value1", "value2")) h = http_headers.Headers(handler=HeaderHandler(parsers={}, generators={})) h.setHeader("test", parsed) self.assertEquals(h.hasHeader("test"), True) self.assertEquals(h.getHeader("test"), parsed) self.assertEquals(h.getHeader("foobar"), None) h.removeHeader("test") self.assertEquals(h.getHeader("test"), None) def testParsedAndRaw(self): def parse(raw): return parsedvalue(raw) def generate(parsed): return parsed.raw rawvalue = ("value1", "value2") rawvalue2 = ("value3", "value4") handler = HeaderHandler(parsers={'test':(parse,)}, generators={'test':(generate,)}) h = http_headers.Headers(handler=handler) h.setRawHeaders("test", rawvalue) self.assertEquals(h.getHeader("test"), parsedvalue(rawvalue)) h.setHeader("test", parsedvalue(rawvalue2)) self.assertEquals(h.getRawHeaders("test"), rawvalue2) # Check the initializers h = http_headers.Headers(rawHeaders={"test": rawvalue}, handler=handler) self.assertEquals(h.getHeader("test"), parsedvalue(rawvalue)) h = http_headers.Headers({"test": parsedvalue(rawvalue2)}, handler=handler) self.assertEquals(h.getRawHeaders("test"), rawvalue2) def testImmutable(self): h = http_headers.Headers(handler=HeaderHandler(parsers={}, generators={})) h.makeImmutable() self.assertRaises(AttributeError, h.setRawHeaders, "test", [1]) self.assertRaises(AttributeError, h.setHeader, "test", 1) self.assertRaises(AttributeError, 
h.removeHeader, "test") class TokenizerTest(unittest.TestCase): """Test header list parsing functions.""" def testParse(self): parser = lambda val: list(http_headers.tokenize([val,])) Token = http_headers.Token tests = (('foo,bar', ['foo', Token(','), 'bar']), ('FOO,BAR', ['foo', Token(','), 'bar']), (' \t foo \t bar \t , \t baz ', ['foo', Token(' '), 'bar', Token(','), 'baz']), ('()<>@,;:\\/[]?={}', [Token('('), Token(')'), Token('<'), Token('>'), Token('@'), Token(','), Token(';'), Token(':'), Token('\\'), Token('/'), Token('['), Token(']'), Token('?'), Token('='), Token('{'), Token('}')]), (' "foo" ', ['foo']), ('"FOO(),\\"BAR,"', ['FOO(),"BAR,'])) raiseTests = ('"open quote', '"ending \\', "control character: \x127", "\x00", "\x1f") for test,result in tests: self.assertEquals(parser(test), result) for test in raiseTests: self.assertRaises(ValueError, parser, test) def testGenerate(self): pass def testRoundtrip(self): pass def atSpecifiedTime(when, func): def inner(*a, **kw): orig = time.time time.time = lambda: when try: return func(*a, **kw) finally: time.time = orig return util.mergeFunctionMetadata(func, inner) def parseHeader(name, val): head = http_headers.Headers(handler=http_headers.DefaultHTTPHandler) head.setRawHeaders(name,val) return head.getHeader(name) parseHeader = atSpecifiedTime(999999990, parseHeader) # Sun, 09 Sep 2001 01:46:30 GMT def generateHeader(name, val): head = http_headers.Headers(handler=http_headers.DefaultHTTPHandler) head.setHeader(name, val) return head.getRawHeaders(name) generateHeader = atSpecifiedTime(999999990, generateHeader) # Sun, 09 Sep 2001 01:46:30 GMT class HeaderParsingTestBase(unittest.TestCase): def runRoundtripTest(self, headername, table): """ Perform some assertions about the behavior of parsing and generating HTTP headers. Specifically: parse an HTTP header value, assert that the parsed form contains all the available information with the correct structure; generate the HTTP header value from the parsed form, assert that it contains certain literal strings; finally, re-parse the generated HTTP header value and assert that the resulting structured data is the same as the first-pass parsed form. @type headername: C{str} @param headername: The name of the HTTP header L{table} contains values for. @type table: A sequence of tuples describing inputs to and outputs from header parsing and generation. The tuples may be either 2 or 3 elements long. In either case: the first element is a string representing an HTTP-format header value; the second element is a dictionary mapping names of parameters to values of those parameters (the parsed form of the header). If there is a third element, it is a list of strings which must occur exactly in the HTTP header value string which is re-generated from the parsed form. 
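For example, a two-element row might be ("15", 15), as in the Max-Forwards test below, and a three-element row might be ("*;q=.7", {'*': 0.7}, ["*;q=0.7"]), as in the Accept-Charset test.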
""" for row in table: if len(row) == 2: rawHeaderInput, parsedHeaderData = row requiredGeneratedElements = [] elif len(row) == 3: rawHeaderInput, parsedHeaderData, requiredGeneratedElements = row assert isinstance(requiredGeneratedElements, list) # parser parsed = parseHeader(headername, [rawHeaderInput,]) self.assertEquals(parsed, parsedHeaderData) regeneratedHeaderValue = generateHeader(headername, parsed) if requiredGeneratedElements: # generator for regeneratedElement in regeneratedHeaderValue: reqEle = requiredGeneratedElements[regeneratedHeaderValue.index(regeneratedElement)] elementIndex = regeneratedElement.find(reqEle) self.assertNotEqual( elementIndex, -1, "%r did not appear in generated HTTP header %r: %r" % (reqEle, headername, regeneratedElement)) # parser/generator reparsed = parseHeader(headername, regeneratedHeaderValue) self.assertEquals(parsed, reparsed) def invalidParseTest(self, headername, values): for val in values: parsed = parseHeader(headername, val) self.assertEquals(parsed, None) class GeneralHeaderParsingTests(HeaderParsingTestBase): def testCacheControl(self): table = ( ("no-cache", {'no-cache':None}), ("no-cache, no-store, max-age=5, max-stale=3, min-fresh=5, no-transform, only-if-cached, blahblah-extension-thingy", {'no-cache': None, 'no-store': None, 'max-age':5, 'max-stale':3, 'min-fresh':5, 'no-transform':None, 'only-if-cached':None, 'blahblah-extension-thingy':None}), ("max-stale", {'max-stale':None}), ("public, private, no-cache, no-store, no-transform, must-revalidate, proxy-revalidate, max-age=5, s-maxage=10, blahblah-extension-thingy", {'public':None, 'private':None, 'no-cache':None, 'no-store':None, 'no-transform':None, 'must-revalidate':None, 'proxy-revalidate':None, 'max-age':5, 's-maxage':10, 'blahblah-extension-thingy':None}), ('private="Set-Cookie, Set-Cookie2", no-cache="PROXY-AUTHENTICATE"', {'private': ['set-cookie', 'set-cookie2'], 'no-cache': ['proxy-authenticate']}, ['private="Set-Cookie, Set-Cookie2"', 'no-cache="Proxy-Authenticate"']), ) self.runRoundtripTest("Cache-Control", table) def testConnection(self): table = ( ("close", ['close',]), ("close, foo-bar", ['close', 'foo-bar']) ) self.runRoundtripTest("Connection", table) def testDate(self): # Don't need major tests since the datetime parser has its own tests self.runRoundtripTest("Date", (("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000),)) # def testPragma(self): # fail # def testTrailer(self): # fail def testTransferEncoding(self): table = ( ('chunked', ['chunked']), ('gzip, chunked', ['gzip', 'chunked']) ) self.runRoundtripTest("Transfer-Encoding", table) # def testUpgrade(self): # fail # def testVia(self): # fail # def testWarning(self): # fail class RequestHeaderParsingTests(HeaderParsingTestBase): #FIXME test ordering too. 
def testAccept(self): table = ( ("audio/*;q=0.2, audio/basic", {http_headers.MimeType('audio', '*'): 0.2, http_headers.MimeType('audio', 'basic'): 1.0}), ("text/plain;q=0.5, text/html, text/x-dvi;q=0.8, text/x-c", {http_headers.MimeType('text', 'plain'): 0.5, http_headers.MimeType('text', 'html'): 1.0, http_headers.MimeType('text', 'x-dvi'): 0.8, http_headers.MimeType('text', 'x-c'): 1.0}), ("text/*, text/html, text/html;level=1, */*", {http_headers.MimeType('text', '*'): 1.0, http_headers.MimeType('text', 'html'): 1.0, http_headers.MimeType('text', 'html', (('level', '1'),)): 1.0, http_headers.MimeType('*', '*'): 1.0}), ("text/*;q=0.3, text/html;q=0.7, text/html;level=1, text/html;level=2;q=0.4, */*;q=0.5", {http_headers.MimeType('text', '*'): 0.3, http_headers.MimeType('text', 'html'): 0.7, http_headers.MimeType('text', 'html', (('level', '1'),)): 1.0, http_headers.MimeType('text', 'html', (('level', '2'),)): 0.4, http_headers.MimeType('*', '*'): 0.5}), ) self.runRoundtripTest("Accept", table) def testAcceptCharset(self): table = ( ("iso-8859-5, unicode-1-1;q=0.8", {'iso-8859-5': 1.0, 'iso-8859-1': 1.0, 'unicode-1-1': 0.8}, ["iso-8859-5", "unicode-1-1;q=0.8", "iso-8859-1"]), ("iso-8859-1;q=0.7", {'iso-8859-1': 0.7}), ("*;q=.7", {'*': 0.7}, ["*;q=0.7"]), ("", {'iso-8859-1': 1.0}, ["iso-8859-1"]), # Yes this is an actual change -- we'll say that's okay. :) ) self.runRoundtripTest("Accept-Charset", table) def testAcceptEncoding(self): table = ( ("compress, gzip", {'compress': 1.0, 'gzip': 1.0, 'identity': 0.0001}), ("", {'identity': 0.0001}), ("*", {'*': 1}), ("compress;q=0.5, gzip;q=1.0", {'compress': 0.5, 'gzip': 1.0, 'identity': 0.0001}, ["compress;q=0.5", "gzip"]), ("gzip;q=1.0, identity;q=0.5, *;q=0", {'gzip': 1.0, 'identity': 0.5, '*':0}, ["gzip", "identity;q=0.5", "*;q=0"]), ) self.runRoundtripTest("Accept-Encoding", table) def testAcceptLanguage(self): table = ( ("da, en-gb;q=0.8, en;q=0.7", {'da': 1.0, 'en-gb': 0.8, 'en': 0.7}), ("*", {'*': 1}), ) self.runRoundtripTest("Accept-Language", table) def testAuthorization(self): table = ( ("Basic dXNlcm5hbWU6cGFzc3dvcmQ=", ("basic", "dXNlcm5hbWU6cGFzc3dvcmQ="), ["basic dXNlcm5hbWU6cGFzc3dvcmQ="]), ('Digest nonce="bar", realm="foo", username="baz", response="bax"', ('digest', 'nonce="bar", realm="foo", username="baz", response="bax"'), ['digest', 'nonce="bar"', 'realm="foo"', 'username="baz"', 'response="bax"']) ) self.runRoundtripTest("Authorization", table) def testCookie(self): table = ( ('name=value', [Cookie('name', 'value')]), ('"name"="value"', [Cookie('"name"', '"value"')]), ('name,"blah=value,"', [Cookie('name,"blah', 'value,"')]), ('name,"blah = value," ', [Cookie('name,"blah', 'value,"')], ['name,"blah=value,"']), ("`~!@#$%^&*()-_+[{]}\\|:'\",<.>/?=`~!@#$%^&*()-_+[{]}\\|:'\",<.>/?", [Cookie("`~!@#$%^&*()-_+[{]}\\|:'\",<.>/?", "`~!@#$%^&*()-_+[{]}\\|:'\",<.>/?")]), ('name,"blah = value," ; name2=val2', [Cookie('name,"blah', 'value,"'), Cookie('name2', 'val2')], ['name,"blah=value,"', 'name2=val2']), ) self.runRoundtripTest("Cookie", table) #newstyle RFC2965 Cookie table2 = ( ('$Version="1";' 'name="value";$Path="/foo";$Domain="www.local";$Port="80,8000";' 'name2="value"', [Cookie('name', 'value', path='/foo', domain='www.local', ports=(80,8000), version=1), Cookie('name2', 'value', version=1)]), ('$Version="1";' 'name="value";$Port', [Cookie('name', 'value', ports=(), version=1)]), ('$Version = 1, NAME = "qq\\"qq",Frob=boo', [Cookie('name', 'qq"qq', version=1), Cookie('frob', 'boo', version=1)], 
['$Version="1";name="qq\\"qq";frob="boo"']), ) self.runRoundtripTest("Cookie", table2) # Generate only! # make headers by combining oldstyle and newstyle cookies table3 = ( ([Cookie('name', 'value'), Cookie('name2', 'value2', version=1)], '$Version="1";name=value;name2="value2"'), ([Cookie('name', 'value', path="/foo"), Cookie('name2', 'value2', domain="bar.baz", version=1)], '$Version="1";name=value;$Path="/foo";name2="value2";$Domain="bar.baz"'), ([Cookie('invalid,"name', 'value'), Cookie('name2', 'value2', version=1)], '$Version="1";name2="value2"'), ([Cookie('name', 'qq"qq'), Cookie('name2', 'value2', version=1)], '$Version="1";name="qq\\"qq";name2="value2"'), ) for row in table3: self.assertEquals(generateHeader("Cookie", row[0]), [row[1],]) def testSetCookie(self): table = ( ('name,"blah=value,; expires=Sun, 09 Sep 2001 01:46:40 GMT; path=/foo; domain=bar.baz; secure', [Cookie('name,"blah', 'value,', expires=1000000000, path="/foo", domain="bar.baz", secure=True)]), ('name,"blah = value, ; expires="Sun, 09 Sep 2001 01:46:40 GMT"', [Cookie('name,"blah', 'value,', expires=1000000000)], ['name,"blah=value,', 'expires=Sun, 09 Sep 2001 01:46:40 GMT']), ) self.runRoundtripTest("Set-Cookie", table) def testSetCookie2(self): table = ( ('name="value"; Comment="YadaYada"; CommentURL="http://frobnotz/"; Discard; Domain="blah.blah"; Max-Age=10; Path="/foo"; Port="80,8080"; Secure; Version="1"', [Cookie("name", "value", comment="YadaYada", commenturl="http://frobnotz/", discard=True, domain="blah.blah", expires=1000000000, path="/foo", ports=(80,8080), secure=True, version=1)]), ) self.runRoundtripTest("Set-Cookie2", table) def testExpect(self): table = ( ("100-continue", {"100-continue":(None,)}), ('foobar=twiddle', {'foobar':('twiddle',)}), ("foo=bar;a=b;c", {'foo':('bar',('a', 'b'), ('c', None))}) ) self.runRoundtripTest("Expect", table) def testFrom(self): self.runRoundtripTest("From", (("webmaster@w3.org", "webmaster@w3.org"),)) def testHost(self): self.runRoundtripTest("Host", (("www.w3.org", "www.w3.org"),)) def testIfMatch(self): table = ( ('"xyzzy"', [http_headers.ETag('xyzzy')]), ('"xyzzy", "r2d2xxxx", "c3piozzzz"', [http_headers.ETag('xyzzy'), http_headers.ETag('r2d2xxxx'), http_headers.ETag('c3piozzzz')]), ('*', ['*']), ) def testIfModifiedSince(self): # Don't need major tests since the datetime parser has its own test # Just test stupid ; length= brokenness. 
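# Some user agents append a non-standard "; length=NNN" parameter to
# If-Modified-Since; the parser should discard it, so the second row below
# parses to the bare timestamp and regenerates without the parameter.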
table = ( ("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000), ("Sun, 09 Sep 2001 01:46:40 GMT; length=500", 1000000000, ["Sun, 09 Sep 2001 01:46:40 GMT"]), ) self.runRoundtripTest("If-Modified-Since", table) def testIfNoneMatch(self): table = ( ('"xyzzy"', [http_headers.ETag('xyzzy')]), ('W/"xyzzy", "r2d2xxxx", "c3piozzzz"', [http_headers.ETag('xyzzy', weak=True), http_headers.ETag('r2d2xxxx'), http_headers.ETag('c3piozzzz')]), ('W/"xyzzy", W/"r2d2xxxx", W/"c3piozzzz"', [http_headers.ETag('xyzzy', weak=True), http_headers.ETag('r2d2xxxx', weak=True), http_headers.ETag('c3piozzzz', weak=True)]), ('*', ['*']), ) self.runRoundtripTest("If-None-Match", table) def testIfRange(self): table = ( ('"xyzzy"', http_headers.ETag('xyzzy')), ('W/"xyzzy"', http_headers.ETag('xyzzy', weak=True)), ('W/"xyzzy"', http_headers.ETag('xyzzy', weak=True)), ("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000), ) self.runRoundtripTest("If-Range", table) def testIfUnmodifiedSince(self): self.runRoundtripTest("If-Unmodified-Since", (("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000),)) def testMaxForwards(self): self.runRoundtripTest("Max-Forwards", (("15", 15),)) # def testProxyAuthorize(self): # fail def testRange(self): table = ( ("bytes=0-499", ('bytes', [(0,499),])), ("bytes=500-999", ('bytes', [(500,999),])), ("bytes=-500",('bytes', [(None,500),])), ("bytes=9500-",('bytes', [(9500, None),])), ("bytes=0-0,-1", ('bytes', [(0,0),(None,1)])), ) self.runRoundtripTest("Range", table) def testReferer(self): self.runRoundtripTest("Referer", (("http://www.w3.org/hypertext/DataSources/Overview.html", "http://www.w3.org/hypertext/DataSources/Overview.html"),)) def testTE(self): table = ( ("deflate", {'deflate':1}), ("", {}), ("trailers, deflate;q=0.5", {'trailers':1, 'deflate':0.5}), ) self.runRoundtripTest("TE", table) def testUserAgent(self): self.runRoundtripTest("User-Agent", (("CERN-LineMode/2.15 libwww/2.17b3", "CERN-LineMode/2.15 libwww/2.17b3"),)) class ResponseHeaderParsingTests(HeaderParsingTestBase): def testAcceptRanges(self): self.runRoundtripTest("Accept-Ranges", (("bytes", ["bytes"]), ("none", ["none"]))) def testAge(self): self.runRoundtripTest("Age", (("15", 15),)) def testETag(self): table = ( ('"xyzzy"', http_headers.ETag('xyzzy')), ('W/"xyzzy"', http_headers.ETag('xyzzy', weak=True)), ('""', http_headers.ETag('')), ) self.runRoundtripTest("ETag", table) def testLocation(self): self.runRoundtripTest("Location", (("http://www.w3.org/pub/WWW/People.htm", "http://www.w3.org/pub/WWW/People.htm"),)) # def testProxyAuthenticate(self): # fail def testRetryAfter(self): # time() is always 999999990 when being tested. 
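# Retry-After may be either an HTTP-date or a delta in seconds; both parse
# to an absolute timestamp ("120" becomes 999999990 + 120 under the frozen
# clock above), while generation emits a delta ("10" for the date in row one).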
table = ( ("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000, ["10"]), ("120", 999999990+120), ) self.runRoundtripTest("Retry-After", table) def testServer(self): self.runRoundtripTest("Server", (("CERN/3.0 libwww/2.17", "CERN/3.0 libwww/2.17"),)) def testVary(self): table = ( ("*", ["*"]), ("Accept, Accept-Encoding", ["accept", "accept-encoding"], ["accept", "accept-encoding"]) ) self.runRoundtripTest("Vary", table) def testWWWAuthenticate(self): digest = ('Digest realm="digest realm", nonce="bAr", qop="auth"', [('Digest', {'realm': 'digest realm', 'nonce': 'bAr', 'qop': 'auth'})], ['Digest', 'realm="digest realm"', 'nonce="bAr"', 'qop="auth"']) basic = ('Basic realm="foo"', [('Basic', {'realm': 'foo'})], ['Basic', 'realm="foo"']) ntlm = ('NTLM', [('NTLM', {})], ['NTLM', '']) negotiate = ('Negotiate SomeGssAPIData', [('Negotiate', 'SomeGssAPIData')], ['Negotiate', 'SomeGssAPIData']) table = (digest, basic, (digest[0]+', '+basic[0], digest[1] + basic[1], [digest[2], basic[2]]), ntlm, negotiate, (ntlm[0]+', '+basic[0], ntlm[1] + basic[1], [ntlm[2], basic[2]]), (digest[0]+', '+negotiate[0], digest[1] + negotiate[1], [digest[2], negotiate[2]]), (negotiate[0]+', '+negotiate[0], negotiate[1] + negotiate[1], [negotiate[2] + negotiate[2]]), (ntlm[0]+', '+ntlm[0], ntlm[1] + ntlm[1], [ntlm[2], ntlm[2]]), (basic[0]+', '+ntlm[0], basic[1] + ntlm[1], [basic[2], ntlm[2]]), ) # runRoundtripTest doesn't work because we don't generate a single # header headername = 'WWW-Authenticate' for row in table: rawHeaderInput, parsedHeaderData, requiredGeneratedElements = row parsed = parseHeader(headername, [rawHeaderInput,]) self.assertEquals(parsed, parsedHeaderData) regeneratedHeaderValue = generateHeader(headername, parsed) for regeneratedElement in regeneratedHeaderValue: requiredElements = requiredGeneratedElements[ regeneratedHeaderValue.index( regeneratedElement)] for reqEle in requiredElements: elementIndex = regeneratedElement.find(reqEle) self.assertNotEqual( elementIndex, -1, "%r did not appear in generated HTTP header %r: %r" % (reqEle, headername, regeneratedElement)) # parser/generator reparsed = parseHeader(headername, regeneratedHeaderValue) self.assertEquals(parsed, reparsed) class EntityHeaderParsingTests(HeaderParsingTestBase): def testAllow(self): # Allow is a silly case-sensitive header unlike all the rest table = ( ("GET", ['GET', ]), ("GET, HEAD, PUT", ['GET', 'HEAD', 'PUT']), ) self.runRoundtripTest("Allow", table) def testContentEncoding(self): table = ( ("gzip", ['gzip',]), ) self.runRoundtripTest("Content-Encoding", table) def testContentLanguage(self): table = ( ("da", ['da',]), ("mi, en", ['mi', 'en']), ) self.runRoundtripTest("Content-Language", table) def testContentLength(self): self.runRoundtripTest("Content-Length", (("15", 15),)) self.invalidParseTest("Content-Length", ("asdf",)) def testContentLocation(self): self.runRoundtripTest("Content-Location", (("http://www.w3.org/pub/WWW/People.htm", "http://www.w3.org/pub/WWW/People.htm"),)) def testContentMD5(self): self.runRoundtripTest("Content-MD5", (("Q2hlY2sgSW50ZWdyaXR5IQ==", "Check Integrity!"),)) self.invalidParseTest("Content-MD5", ("sdlaksjdfhlkaj",)) def testContentRange(self): table = ( ("bytes 0-499/1234", ("bytes", 0, 499, 1234)), ("bytes 500-999/1234", ("bytes", 500, 999, 1234)), ("bytes 500-1233/1234", ("bytes", 500, 1233, 1234)), ("bytes 734-1233/1234", ("bytes", 734, 1233, 1234)), ("bytes 734-1233/*", ("bytes", 734, 1233, None)), ("bytes */1234", ("bytes", None, None, 1234)), ("bytes */*", ("bytes", None, None, None)) ) 
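# The parsed Content-Range form is a (units, start, end, instance-length)
# tuple, with None standing in for "*" where the range or total length is unknown.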
self.runRoundtripTest("Content-Range", table) def testContentType(self): table = ( ("text/html;charset=iso-8859-4", http_headers.MimeType('text', 'html', (('charset','iso-8859-4'),))), ("text/html", http_headers.MimeType('text', 'html')), ) self.runRoundtripTest("Content-Type", table) def testExpires(self): self.runRoundtripTest("Expires", (("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000),)) # Invalid expires MUST return date in the past. self.assertEquals(parseHeader("Expires", ["0"]), 0) self.assertEquals(parseHeader("Expires", ["wejthnaljn"]), 0) def testLastModified(self): # Don't need major tests since the datetime parser has its own test self.runRoundtripTest("Last-Modified", (("Sun, 09 Sep 2001 01:46:40 GMT", 1000000000),)) class DateTimeTest(unittest.TestCase): """Test date parsing functions.""" def testParse(self): timeNum = 784111777 timeStrs = ('Sun, 06 Nov 1994 08:49:37 GMT', 'Sunday, 06-Nov-94 08:49:37 GMT', 'Sun Nov 6 08:49:37 1994', # Also some non-RFC formats, for good measure. 'Somefakeday 6 Nov 1994 8:49:37', '6 Nov 1994 8:49:37', 'Sun, 6 Nov 1994 8:49:37', '6 Nov 1994 8:49:37 GMT', '06-Nov-94 08:49:37', 'Sunday, 06-Nov-94 08:49:37', '06-Nov-94 08:49:37 GMT', 'Nov 6 08:49:37 1994', ) for timeStr in timeStrs: self.assertEquals(http_headers.parseDateTime(timeStr), timeNum) # Test 2 Digit date wraparound yuckiness. self.assertEquals(http_headers.parseDateTime( 'Monday, 11-Oct-04 14:56:50 GMT'), 1097506610) self.assertEquals(http_headers.parseDateTime( 'Monday, 11-Oct-2004 14:56:50 GMT'), 1097506610) def testGenerate(self): self.assertEquals(http_headers.generateDateTime(784111777), 'Sun, 06 Nov 1994 08:49:37 GMT') def testRoundtrip(self): for i in range(2000): time = random.randint(0, 2000000000) timestr = http_headers.generateDateTime(time) time2 = http_headers.parseDateTime(timestr) self.assertEquals(time, time2) class TestMimeType(unittest.TestCase): def testEquality(self): """Test that various uses of the constructer are equal """ kwargMime = http_headers.MimeType('text', 'plain', key='value', param=None) dictMime = http_headers.MimeType('text', 'plain', {'param': None, 'key': 'value'}) tupleMime = http_headers.MimeType('text', 'plain', (('param', None), ('key', 'value'))) stringMime = http_headers.MimeType.fromString('text/plain;key=value;param') self.assertEquals(kwargMime, dictMime) self.assertEquals(dictMime, tupleMime) self.assertEquals(kwargMime, tupleMime) self.assertEquals(kwargMime, stringMime) TwistedWeb2-8.1.0/twisted/web2/test/test_stream.py0000644000175000017500000004565010514325306020553 0ustar dokodokoimport tempfile, operator, sys, os from twisted.trial import unittest from twisted.internet import reactor, defer, interfaces from twisted.python import log from zope.interface import Interface, Attribute, implements from twisted.python.util import sibpath from twisted.web2 import stream def bufstr(data): try: return str(buffer(data)) except TypeError: raise TypeError("%s doesn't conform to the buffer interface" % (data,)) class SimpleStreamTests: text = '1234567890' def test_split(self): for point in range(10): s = self.makeStream(0) a,b = s.split(point) if point > 0: self.assertEquals(bufstr(a.read()), self.text[:point]) self.assertEquals(a.read(), None) if point < len(self.text): self.assertEquals(bufstr(b.read()), self.text[point:]) self.assertEquals(b.read(), None) for point in range(7): s = self.makeStream(2, 6) self.assertEquals(s.length, 6) a,b = s.split(point) if point > 0: self.assertEquals(bufstr(a.read()), self.text[2:point+2]) 
self.assertEquals(a.read(), None) if point < 6: self.assertEquals(bufstr(b.read()), self.text[point+2:8]) self.assertEquals(b.read(), None) def test_read(self): s = self.makeStream() self.assertEquals(s.length, len(self.text)) self.assertEquals(bufstr(s.read()), self.text) self.assertEquals(s.read(), None) s = self.makeStream(0, 4) self.assertEquals(s.length, 4) self.assertEquals(bufstr(s.read()), self.text[0:4]) self.assertEquals(s.read(), None) self.assertEquals(s.length, 0) s = self.makeStream(4, 6) self.assertEquals(s.length, 6) self.assertEquals(bufstr(s.read()), self.text[4:10]) self.assertEquals(s.read(), None) self.assertEquals(s.length, 0) class FileStreamTest(SimpleStreamTests, unittest.TestCase): def makeStream(self, *args, **kw): return stream.FileStream(self.f, *args, **kw) def setUpClass(self): f = tempfile.TemporaryFile('w+') f.write(self.text) f.seek(0, 0) self.f = f def test_close(self): s = self.makeStream() s.close() self.assertEquals(s.length, 0) # Make sure close doesn't close file # would raise exception if f is closed self.f.seek(0, 0) def test_read2(self): s = self.makeStream(0) s.CHUNK_SIZE = 6 self.assertEquals(s.length, 10) self.assertEquals(bufstr(s.read()), self.text[0:6]) self.assertEquals(bufstr(s.read()), self.text[6:10]) self.assertEquals(s.read(), None) s = self.makeStream(0) s.CHUNK_SIZE = 5 self.assertEquals(s.length, 10) self.assertEquals(bufstr(s.read()), self.text[0:5]) self.assertEquals(bufstr(s.read()), self.text[5:10]) self.assertEquals(s.read(), None) s = self.makeStream(0, 20) self.assertEquals(s.length, 20) self.assertEquals(bufstr(s.read()), self.text) self.assertRaises(RuntimeError, s.read) # ran out of data class MMapFileStreamTest(SimpleStreamTests, unittest.TestCase): def makeStream(self, *args, **kw): return stream.FileStream(self.f, *args, **kw) def setUpClass(self): f = tempfile.TemporaryFile('w+') self.text = self.text*(stream.MMAP_THRESHOLD//len(self.text) + 1) f.write(self.text) f.seek(0, 0) self.f=f def test_mmapwrapper(self): self.assertRaises(TypeError, stream.mmapwrapper) self.assertRaises(TypeError, stream.mmapwrapper, offset = 0) self.assertRaises(TypeError, stream.mmapwrapper, offset = None) if not stream.mmap: test_mmapwrapper.skip = 'mmap not supported here' class MemoryStreamTest(SimpleStreamTests, unittest.TestCase): def makeStream(self, *args, **kw): return stream.MemoryStream(self.text, *args, **kw) def test_close(self): s = self.makeStream() s.close() self.assertEquals(s.length, 0) def test_read2(self): self.assertRaises(ValueError, self.makeStream, 0, 20) testdata = """I was angry with my friend: I told my wrath, my wrath did end. I was angry with my foe: I told it not, my wrath did grow. And I water'd it in fears, Night and morning with my tears; And I sunned it with smiles, And with soft deceitful wiles. 
And it grew both day and night, Till it bore an apple bright; And my foe beheld it shine, And he knew that is was mine, And into my garden stole When the night had veil'd the pole: In the morning glad I see My foe outstretch'd beneath the tree""" class TestSubstream(unittest.TestCase): def setUp(self): self.data = testdata self.s = stream.MemoryStream(self.data) def suckTheMarrow(self, s): return ''.join(map(str, list(iter(s.read, None)))) def testStart(self): s = stream.substream(self.s, 0, 11) self.assertEquals('I was angry', self.suckTheMarrow(s)) def testNotStart(self): s = stream.substream(self.s, 12, 26) self.assertEquals('with my friend', self.suckTheMarrow(s)) def testReverseStartEnd(self): self.assertRaises(ValueError, stream.substream, self.s, 26, 12) def testEmptySubstream(self): s = stream.substream(self.s, 11, 11) self.assertEquals('', self.suckTheMarrow(s)) def testEnd(self): size = len(self.data) s = stream.substream(self.s, size-4, size) self.assertEquals('tree', self.suckTheMarrow(s)) def testPastEnd(self): size = len(self.data) self.assertRaises(ValueError, stream.substream, self.s, size-4, size+8) class TestBufferedStream(unittest.TestCase): def setUp(self): self.data = testdata.replace('\n', '\r\n') s = stream.MemoryStream(self.data) self.s = stream.BufferedStream(s) def _cbGotData(self, data, expected): self.assertEqual(data, expected) def test_readline(self): """Test that readline reads a line.""" d = self.s.readline() d.addCallback(self._cbGotData, 'I was angry with my friend:\r\n') return d def test_readlineWithSize(self): """Test the size argument to readline""" d = self.s.readline(size = 5) d.addCallback(self._cbGotData, 'I was') return d def test_readlineWithBigSize(self): """Test the size argument when it's bigger than the length of the line.""" d = self.s.readline(size = 40) d.addCallback(self._cbGotData, 'I was angry with my friend:\r\n') return d def test_readlineWithZero(self): """Test readline with size = 0.""" d = self.s.readline(size = 0) d.addCallback(self._cbGotData, '') return d def test_readlineFinished(self): """Test readline on a finished stream.""" nolines = len(self.data.split('\r\n')) for i in range(nolines): self.s.readline() d = self.s.readline() d.addCallback(self._cbGotData, '') return d def test_readlineNegSize(self): """Ensure that readline with a negative size raises an exception.""" self.assertRaises(ValueError, self.s.readline, size = -1) def test_readlineSizeInDelimiter(self): """ Test behavior of readline when size falls inside the delimiter. """ d = self.s.readline(size=28) d.addCallback(self._cbGotData, "I was angry with my friend:\r") d.addCallback(lambda _: self.s.readline()) d.addCallback(self._cbGotData, "\nI told my wrath, my wrath did end.\r\n") def test_readExactly(self): """Make sure readExactly with no arg reads all the data.""" d = self.s.readExactly() d.addCallback(self._cbGotData, self.data) return d def test_readExactly(self): """Test readExactly with a number.""" d = self.s.readExactly(10) d.addCallback(self._cbGotData, self.data[:10]) return d def test_readExactlyBig(self): """ Test readExactly with a number larger than the size of the datastream. """ d = self.s.readExactly(100000) d.addCallback(self._cbGotData, self.data) return d def test_read(self): """ Make sure read() also functions. (note that this test uses an implementation detail of this particular stream. s.read() isn't guaranteed to return self.data on all streams.) 
""" self.assertEqual(str(self.s.read()), self.data) class TestStreamer: implements(stream.IStream, stream.IByteStream) length = None readCalled=0 closeCalled=0 def __init__(self, list): self.list = list def read(self): self.readCalled+=1 if self.list: return self.list.pop(0) return None def close(self): self.closeCalled+=1 self.list = [] class FallbackSplitTest(unittest.TestCase): def test_split(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 5) self.assertEquals(left.length, 5) self.assertEquals(right.length, None) self.assertEquals(bufstr(left.read()), 'abcd') d = left.read() d.addCallback(self._cbSplit, left, right) return d def _cbSplit(self, result, left, right): self.assertEquals(bufstr(result), 'e') self.assertEquals(left.read(), None) self.assertEquals(bufstr(right.read().result), 'fgh') self.assertEquals(bufstr(right.read()), 'ijkl') self.assertEquals(right.read(), None) def test_split2(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 4) self.assertEquals(left.length, 4) self.assertEquals(right.length, None) self.assertEquals(bufstr(left.read()), 'abcd') self.assertEquals(left.read(), None) self.assertEquals(bufstr(right.read().result), 'efgh') self.assertEquals(bufstr(right.read()), 'ijkl') self.assertEquals(right.read(), None) def test_splitsplit(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 5) left,middle = left.split(3) self.assertEquals(left.length, 3) self.assertEquals(middle.length, 2) self.assertEquals(right.length, None) self.assertEquals(bufstr(left.read()), 'abc') self.assertEquals(left.read(), None) self.assertEquals(bufstr(middle.read().result), 'd') self.assertEquals(bufstr(middle.read().result), 'e') self.assertEquals(middle.read(), None) self.assertEquals(bufstr(right.read().result), 'fgh') self.assertEquals(bufstr(right.read()), 'ijkl') self.assertEquals(right.read(), None) def test_closeboth(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 5) left.close() self.assertEquals(s.closeCalled, 0) right.close() # Make sure nothing got read self.assertEquals(s.readCalled, 0) self.assertEquals(s.closeCalled, 1) def test_closeboth_rev(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 5) right.close() self.assertEquals(s.closeCalled, 0) left.close() # Make sure nothing got read self.assertEquals(s.readCalled, 0) self.assertEquals(s.closeCalled, 1) def test_closeleft(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 5) left.close() d = right.read() d.addCallback(self._cbCloseleft, right) return d def _cbCloseleft(self, result, right): self.assertEquals(bufstr(result), 'fgh') self.assertEquals(bufstr(right.read()), 'ijkl') self.assertEquals(right.read(), None) def test_closeright(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) left,right = stream.fallbackSplit(s, 3) right.close() self.assertEquals(bufstr(left.read()), 'abc') self.assertEquals(left.read(), None) self.assertEquals(s.closeCalled, 1) class ProcessStreamerTest(unittest.TestCase): if interfaces.IReactorProcess(reactor, None) is None: skip = "Platform lacks spawnProcess support, can't test process streaming." 
def runCode(self, code, inputStream=None): if inputStream is None: inputStream = stream.MemoryStream("") return stream.ProcessStreamer(inputStream, sys.executable, [sys.executable, "-u", "-c", code], os.environ) def test_output(self): p = self.runCode("import sys\nfor i in range(100): sys.stdout.write('x' * 1000)") l = [] d = stream.readStream(p.outStream, l.append) def verify(_): self.assertEquals("".join(l), ("x" * 1000) * 100) d2 = p.run() return d.addCallback(verify).addCallback(lambda _: d2) def test_errouput(self): p = self.runCode("import sys\nfor i in range(100): sys.stderr.write('x' * 1000)") l = [] d = stream.readStream(p.errStream, l.append) def verify(_): self.assertEquals("".join(l), ("x" * 1000) * 100) p.run() return d.addCallback(verify) def test_input(self): p = self.runCode("import sys\nsys.stdout.write(sys.stdin.read())", "hello world") l = [] d = stream.readStream(p.outStream, l.append) d2 = p.run() def verify(_): self.assertEquals("".join(l), "hello world") return d2 return d.addCallback(verify) def test_badexit(self): p = self.runCode("raise ValueError") l = [] from twisted.internet.error import ProcessTerminated def verify(_): self.assertEquals(l, [1]) self.assert_(p.outStream.closed) self.assert_(p.errStream.closed) return p.run().addErrback(lambda _: _.trap(ProcessTerminated) and l.append(1)).addCallback(verify) def test_inputerror(self): p = self.runCode("import sys\nsys.stdout.write(sys.stdin.read())", TestStreamer(["hello", defer.fail(ZeroDivisionError())])) l = [] d = stream.readStream(p.outStream, l.append) d2 = p.run() def verify(_): self.assertEquals("".join(l), "hello") return d2 return d.addCallback(verify).addCallback(lambda _: log.flushErrors(ZeroDivisionError)) def test_processclosedinput(self): p = self.runCode("import sys; sys.stdout.write(sys.stdin.read(3));" + "sys.stdin.close(); sys.stdout.write('def')", "abc123") l = [] d = stream.readStream(p.outStream, l.append) def verify(_): self.assertEquals("".join(l), "abcdef") d2 = p.run() return d.addCallback(verify).addCallback(lambda _: d2) class AdapterTestCase(unittest.TestCase): def test_adapt(self): fName = self.mktemp() f = file(fName, "w") f.write("test") f.close() for i in ("test", buffer("test"), file(fName)): s = stream.IByteStream(i) self.assertEquals(str(s.read()), "test") self.assertEquals(s.read(), None) class ReadStreamTestCase(unittest.TestCase): def test_pull(self): l = [] s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) return readStream(s, l.append).addCallback( lambda _: self.assertEquals(l, ["abcd", "efgh", "ijkl"])) def test_pullFailure(self): l = [] s = TestStreamer(['abcd', defer.fail(RuntimeError()), 'ijkl']) def test(result): result.trap(RuntimeError) self.assertEquals(l, ["abcd"]) return readStream(s, l.append).addErrback(test) def test_pullException(self): class Failer: def read(self): raise RuntimeError return readStream(Failer(), lambda _: None).addErrback(lambda _: _.trap(RuntimeError)) def test_processingException(self): s = TestStreamer(['abcd', defer.succeed('efgh'), 'ijkl']) return readStream(s, lambda x: 1/0).addErrback(lambda _: _.trap(ZeroDivisionError)) class ProducerStreamTestCase(unittest.TestCase): def test_failfinish(self): p = stream.ProducerStream() p.write("hello") p.finish(RuntimeError()) self.assertEquals(p.read(), "hello") d = p.read() l = [] d.addErrback(lambda _: (l.append(1), _.trap(RuntimeError))).addCallback( lambda _: self.assertEquals(l, [1])) return d from twisted.web2.stream import * class CompoundStreamTest: """ CompoundStream lets you 
combine many streams into one continuous stream. For example, let's make a stream: >>> s = CompoundStream() Then, add a couple streams: >>> s.addStream(MemoryStream("Stream1")) >>> s.addStream(MemoryStream("Stream2")) The length is the sum of all the streams: >>> s.length 14 We can read data from the stream: >>> str(s.read()) 'Stream1' After having read some data, length is now smaller, as you might expect: >>> s.length 7 So, continue reading... >>> str(s.read()) 'Stream2' Now that the stream is exhausted: >>> s.read() is None True >>> s.length 0 We can also create CompoundStream more easily like so: >>> s = CompoundStream(['hello', MemoryStream(' world')]) >>> str(s.read()) 'hello' >>> str(s.read()) ' world' For a more complicated example, let's try reading from a file: >>> s = CompoundStream() >>> s.addStream(FileStream(open(sibpath(__file__, "stream_data.txt")))) >>> s.addStream("================") >>> s.addStream(FileStream(open(sibpath(__file__, "stream_data.txt")))) Again, the length is the sum: >>> int(s.length) 58 >>> str(s.read()) "We've got some text!\\n" >>> str(s.read()) '================' What if you close the stream? >>> s.close() >>> s.read() is None True >>> s.length 0 Error handling works using Deferreds: >>> m = MemoryStream("after") >>> s = CompoundStream([TestStreamer([defer.fail(ZeroDivisionError())]), m]) >>> l = []; x = s.read().addErrback(lambda _: l.append(1)) >>> l [1] >>> s.length 0 >>> m.length # streams after the failed one got closed 0 """ __doctests__ = ['twisted.web2.test.test_stream', 'twisted.web2.stream'] # TODO: # CompoundStreamTest # more tests for ProducerStreamTest # StreamProducerTest TwistedWeb2-8.1.0/twisted/web2/test/test_plugin.py0000644000175000017500000000240310340730377020550 0ustar dokodokofrom twisted.trial import unittest from twisted.web2 import resource, http from twisted.web2 import plugin class MyDefaultResource(plugin.PluginResource): def render(self, req): http.Response(200, stream='DEFAULT') class TestResourcePlugin(unittest.TestCase): def testResource(self): assert isinstance(plugin.resourcePlugger('TestResource'), plugin.TestResource) def testResourceArguments(self): myPluggedResource = plugin.resourcePlugger('TestResource', 'Foo', bar='Bar') assert isinstance(myPluggedResource, plugin.TestResource) self.assertEquals(myPluggedResource.foo, 'Foo') self.assertEquals(myPluggedResource.bar, 'Bar') def testNoPlugin(self): myPluggedResource = plugin.resourcePlugger('NoSuchResource') assert isinstance(myPluggedResource, plugin.NoPlugin) self.assertEquals(myPluggedResource.plugin, 'NoSuchResource') def testDefaultPlugin(self): myPluggedResource = plugin.resourcePlugger('NoSuchResource', defaultResource=MyDefaultResource) assert isinstance(myPluggedResource, MyDefaultResource) TwistedWeb2-8.1.0/twisted/web2/test/__init__.py0000644000175000017500000000024610362275710017754 0ustar dokodoko# Copyright (c) 2001-2006 Twisted Matrix Laboratories. # See LICENSE for details. 
""" twisted.web2.test: unittests for the Twisted Web2, Web Server Framework """ TwistedWeb2-8.1.0/twisted/web2/test/server.pem0000644000175000017500000000400010043426411017634 0ustar dokodoko-----BEGIN CERTIFICATE----- MIIDBjCCAm+gAwIBAgIBATANBgkqhkiG9w0BAQQFADB7MQswCQYDVQQGEwJTRzER MA8GA1UEChMITTJDcnlwdG8xFDASBgNVBAsTC00yQ3J5cHRvIENBMSQwIgYDVQQD ExtNMkNyeXB0byBDZXJ0aWZpY2F0ZSBNYXN0ZXIxHTAbBgkqhkiG9w0BCQEWDm5n cHNAcG9zdDEuY29tMB4XDTAwMDkxMDA5NTEzMFoXDTAyMDkxMDA5NTEzMFowUzEL MAkGA1UEBhMCU0cxETAPBgNVBAoTCE0yQ3J5cHRvMRIwEAYDVQQDEwlsb2NhbGhv c3QxHTAbBgkqhkiG9w0BCQEWDm5ncHNAcG9zdDEuY29tMFwwDQYJKoZIhvcNAQEB BQADSwAwSAJBAKy+e3dulvXzV7zoTZWc5TzgApr8DmeQHTYC8ydfzH7EECe4R1Xh 5kwIzOuuFfn178FBiS84gngaNcrFi0Z5fAkCAwEAAaOCAQQwggEAMAkGA1UdEwQC MAAwLAYJYIZIAYb4QgENBB8WHU9wZW5TU0wgR2VuZXJhdGVkIENlcnRpZmljYXRl MB0GA1UdDgQWBBTPhIKSvnsmYsBVNWjj0m3M2z0qVTCBpQYDVR0jBIGdMIGagBT7 hyNp65w6kxXlxb8pUU/+7Sg4AaF/pH0wezELMAkGA1UEBhMCU0cxETAPBgNVBAoT CE0yQ3J5cHRvMRQwEgYDVQQLEwtNMkNyeXB0byBDQTEkMCIGA1UEAxMbTTJDcnlw dG8gQ2VydGlmaWNhdGUgTWFzdGVyMR0wGwYJKoZIhvcNAQkBFg5uZ3BzQHBvc3Qx LmNvbYIBADANBgkqhkiG9w0BAQQFAAOBgQA7/CqT6PoHycTdhEStWNZde7M/2Yc6 BoJuVwnW8YxGO8Sn6UJ4FeffZNcYZddSDKosw8LtPOeWoK3JINjAk5jiPQ2cww++ 7QGG/g5NDjxFZNDJP1dGiLAxPW6JXwov4v0FmdzfLOZ01jDcgQQZqEpYlgpuI5JE WUQ9Ho4EzbYCOQ== -----END CERTIFICATE----- -----BEGIN RSA PRIVATE KEY----- MIIBPAIBAAJBAKy+e3dulvXzV7zoTZWc5TzgApr8DmeQHTYC8ydfzH7EECe4R1Xh 5kwIzOuuFfn178FBiS84gngaNcrFi0Z5fAkCAwEAAQJBAIqm/bz4NA1H++Vx5Ewx OcKp3w19QSaZAwlGRtsUxrP7436QjnREM3Bm8ygU11BjkPVmtrKm6AayQfCHqJoT ZIECIQDW0BoMoL0HOYM/mrTLhaykYAVqgIeJsPjvkEhTFXWBuQIhAM3deFAvWNu4 nklUQ37XsCT2c9tmNt1LAT+slG2JOTTRAiAuXDtC/m3NYVwyHfFm+zKHRzHkClk2 HjubeEgjpj32AQIhAJqMGTaZVOwevTXvvHwNEH+vRWsAYU/gbx+OQB+7VOcBAiEA oolb6NMg/R3enNPvS1O4UU1H8wpaF77L4yiSWlE0p4w= -----END RSA PRIVATE KEY----- -----BEGIN CERTIFICATE REQUEST----- MIIBDTCBuAIBADBTMQswCQYDVQQGEwJTRzERMA8GA1UEChMITTJDcnlwdG8xEjAQ BgNVBAMTCWxvY2FsaG9zdDEdMBsGCSqGSIb3DQEJARYObmdwc0Bwb3N0MS5jb20w XDANBgkqhkiG9w0BAQEFAANLADBIAkEArL57d26W9fNXvOhNlZzlPOACmvwOZ5Ad NgLzJ1/MfsQQJ7hHVeHmTAjM664V+fXvwUGJLziCeBo1ysWLRnl8CQIDAQABoAAw DQYJKoZIhvcNAQEEBQADQQA7uqbrNTjVWpF6By5ZNPvhZ4YdFgkeXFVWi5ao/TaP Vq4BG021fJ9nlHRtr4rotpgHDX1rr+iWeHKsx4+5DRSy -----END CERTIFICATE REQUEST----- TwistedWeb2-8.1.0/twisted/web2/test/test_httpauth.py0000644000175000017500000007664510573065164021141 0ustar dokodokoimport md5 from twisted.internet import address from twisted.trial import unittest from twisted.cred import error from twisted.web2 import http, responsecode from twisted.web2.auth import basic, digest, wrapper from twisted.web2.auth.interfaces import IAuthenticatedRequest, IHTTPUser from twisted.web2.test.test_server import SimpleRequest from twisted.web2.test import test_server import base64 class FakeDigestCredentialFactory(digest.DigestCredentialFactory): """ A Fake Digest Credential Factory that generates a predictable nonce and opaque """ def __init__(self, *args, **kwargs): super(FakeDigestCredentialFactory, self).__init__(*args, **kwargs) self.privateKey = "0" def generateNonce(self): """ Generate a static nonce """ return '178288758716122392881254770685' def _getTime(self): """ Return a stable time """ return 0 class BasicAuthTestCase(unittest.TestCase): def setUp(self): self.credentialFactory = basic.BasicCredentialFactory('foo') self.username = 'dreid' self.password = 'S3CuR1Ty' def testUsernamePassword(self): response = base64.encodestring('%s:%s' % ( self.username, self.password)) creds = self.credentialFactory.decode(response, _trivial_GET) 
self.failUnless(creds.checkPassword(self.password)) def testIncorrectPassword(self): response = base64.encodestring('%s:%s' % ( self.username, 'incorrectPassword')) creds = self.credentialFactory.decode(response, _trivial_GET) self.failIf(creds.checkPassword(self.password)) def testIncorrectPadding(self): response = base64.encodestring('%s:%s' % ( self.username, self.password)) response = response.strip('=') creds = self.credentialFactory.decode(response, _trivial_GET) self.failUnless(creds.checkPassword(self.password)) def testInvalidCredentials(self): response = base64.encodestring(self.username) self.assertRaises(error.LoginFailed, self.credentialFactory.decode, response, _trivial_GET) clientAddress = address.IPv4Address('TCP', '127.0.0.1', 80) challengeOpaque = ('75c4bd95b96b7b7341c646c6502f0833-MTc4Mjg4NzU' '4NzE2MTIyMzkyODgxMjU0NzcwNjg1LHJlbW90ZWhvc3Q' 'sMA==') challengeNonce = '178288758716122392881254770685' challengeResponse = ('digest', {'nonce': challengeNonce, 'qop': 'auth', 'realm': 'test realm', 'algorithm': 'md5', 'opaque': challengeOpaque}) cnonce = "29fc54aa1641c6fa0e151419361c8f23" authRequest1 = ('username="username", realm="test realm", nonce="%s", ' 'uri="/write/", response="%s", opaque="%s", algorithm="md5", ' 'cnonce="29fc54aa1641c6fa0e151419361c8f23", nc=00000001, ' 'qop="auth"') authRequest2 = ('username="username", realm="test realm", nonce="%s", ' 'uri="/write/", response="%s", opaque="%s", algorithm="md5", ' 'cnonce="29fc54aa1641c6fa0e151419361c8f23", nc=00000002, ' 'qop="auth"') namelessAuthRequest = 'realm="test realm",nonce="doesn\'t matter"' class DigestAuthTestCase(unittest.TestCase): """ Test the behavior of DigestCredentialFactory """ def setUp(self): """ Create a DigestCredentialFactory for testing """ self.credentialFactory = digest.DigestCredentialFactory('md5', 'test realm') def getDigestResponse(self, challenge, ncount): """ Calculate the response for the given challenge """ nonce = challenge.get('nonce') algo = challenge.get('algorithm').lower() qop = challenge.get('qop') expected = digest.calcResponse( digest.calcHA1(algo, "username", "test realm", "password", nonce, cnonce), algo, nonce, ncount, cnonce, qop, "GET", "/write/", None ) return expected def test_getChallenge(self): """ Test that all the required fields exist in the challenge, and that the information matches what we put into our DigestCredentialFactory """ challenge = self.credentialFactory.getChallenge(clientAddress) self.assertEquals(challenge['qop'], 'auth') self.assertEquals(challenge['realm'], 'test realm') self.assertEquals(challenge['algorithm'], 'md5') self.assertTrue(challenge.has_key("nonce")) self.assertTrue(challenge.has_key("opaque")) def test_response(self): """ Test that we can decode a valid response to our challenge """ challenge = self.credentialFactory.getChallenge(clientAddress) clientResponse = authRequest1 % ( challenge['nonce'], self.getDigestResponse(challenge, "00000001"), challenge['opaque']) creds = self.credentialFactory.decode(clientResponse, _trivial_GET) self.failUnless(creds.checkPassword('password')) def test_multiResponse(self): """ Test that multiple responses to to a single challenge are handled successfully. 
""" challenge = self.credentialFactory.getChallenge(clientAddress) clientResponse = authRequest1 % ( challenge['nonce'], self.getDigestResponse(challenge, "00000001"), challenge['opaque']) creds = self.credentialFactory.decode(clientResponse, _trivial_GET) self.failUnless(creds.checkPassword('password')) clientResponse = authRequest2 % ( challenge['nonce'], self.getDigestResponse(challenge, "00000002"), challenge['opaque']) creds = self.credentialFactory.decode(clientResponse, _trivial_GET) self.failUnless(creds.checkPassword('password')) def test_failsWithDifferentMethod(self): """ Test that the response fails if made for a different request method than it is being issued for. """ challenge = self.credentialFactory.getChallenge(clientAddress) clientResponse = authRequest1 % ( challenge['nonce'], self.getDigestResponse(challenge, "00000001"), challenge['opaque']) creds = self.credentialFactory.decode(clientResponse, SimpleRequest(None, 'POST', '/')) self.failIf(creds.checkPassword('password')) def test_noUsername(self): """ Test that login fails when our response does not contain a username, or the username field is empty. """ # Check for no username e = self.assertRaises(error.LoginFailed, self.credentialFactory.decode, namelessAuthRequest, _trivial_GET) self.assertEquals(str(e), "Invalid response, no username given.") # Check for an empty username e = self.assertRaises(error.LoginFailed, self.credentialFactory.decode, namelessAuthRequest + ',username=""', _trivial_GET) self.assertEquals(str(e), "Invalid response, no username given.") def test_noNonce(self): """ Test that login fails when our response does not contain a nonce """ e = self.assertRaises(error.LoginFailed, self.credentialFactory.decode, 'realm="Test",username="Foo",opaque="bar"', _trivial_GET) self.assertEquals(str(e), "Invalid response, no nonce given.") def test_noOpaque(self): """ Test that login fails when our response does not contain a nonce """ e = self.assertRaises(error.LoginFailed, self.credentialFactory.decode, 'realm="Test",username="Foo"', _trivial_GET) self.assertEquals(str(e), "Invalid response, no opaque given.") def test_checkHash(self): """ Check that given a hash of the form 'username:realm:password' we can verify the digest challenge """ challenge = self.credentialFactory.getChallenge(clientAddress) clientResponse = authRequest1 % ( challenge['nonce'], self.getDigestResponse(challenge, "00000001"), challenge['opaque']) creds = self.credentialFactory.decode(clientResponse, _trivial_GET) self.failUnless(creds.checkHash( md5.md5('username:test realm:password').hexdigest())) self.failIf(creds.checkHash( md5.md5('username:test realm:bogus').hexdigest())) def test_invalidOpaque(self): """ Test that login fails when the opaque does not contain all the required parts. """ credentialFactory = FakeDigestCredentialFactory('md5', 'test realm') challenge = credentialFactory.getChallenge(clientAddress) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, 'badOpaque', challenge['nonce'], clientAddress.host) badOpaque = ('foo-%s' % ( 'nonce,clientip'.encode('base64').strip('\n'),)) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, badOpaque, challenge['nonce'], clientAddress.host) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, '', challenge['nonce'], clientAddress.host) def test_incompatibleNonce(self): """ Test that login fails when the given nonce from the response, does not match the nonce encoded in the opaque. 
""" credentialFactory = FakeDigestCredentialFactory('md5', 'test realm') challenge = credentialFactory.getChallenge(clientAddress) badNonceOpaque = credentialFactory.generateOpaque( '1234567890', clientAddress.host) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, badNonceOpaque, challenge['nonce'], clientAddress.host) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, badNonceOpaque, '', clientAddress.host) def test_incompatibleClientIp(self): """ Test that the login fails when the request comes from a client ip other than what is encoded in the opaque. """ credentialFactory = FakeDigestCredentialFactory('md5', 'test realm') challenge = credentialFactory.getChallenge(clientAddress) badNonceOpaque = credentialFactory.generateOpaque( challenge['nonce'], '10.0.0.1') self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, badNonceOpaque, challenge['nonce'], clientAddress.host) def test_oldNonce(self): """ Test that the login fails when the given opaque is older than DigestCredentialFactory.CHALLENGE_LIFETIME_SECS """ credentialFactory = FakeDigestCredentialFactory('md5', 'test realm') challenge = credentialFactory.getChallenge(clientAddress) key = '%s,%s,%s' % (challenge['nonce'], clientAddress.host, '-137876876') digest = md5.new(key + credentialFactory.privateKey).hexdigest() ekey = key.encode('base64') oldNonceOpaque = '%s-%s' % (digest, ekey.strip('\n')) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, oldNonceOpaque, challenge['nonce'], clientAddress.host) def test_mismatchedOpaqueChecksum(self): """ Test that login fails when the opaque checksum fails verification """ credentialFactory = FakeDigestCredentialFactory('md5', 'test realm') challenge = credentialFactory.getChallenge(clientAddress) key = '%s,%s,%s' % (challenge['nonce'], clientAddress.host, '0') digest = md5.new(key + 'this is not the right pkey').hexdigest() badChecksum = '%s-%s' % (digest, key.encode('base64').strip('\n')) self.assertRaises( error.LoginFailed, credentialFactory.verifyOpaque, badChecksum, challenge['nonce'], clientAddress.host) def test_incompatibleCalcHA1Options(self): """ Test that the appropriate error is raised when any of the pszUsername, pszRealm, or pszPassword arguments are specified with the preHA1 keyword argument. """ arguments = ( ("user", "realm", "password", "preHA1"), (None, "realm", None, "preHA1"), (None, None, "password", "preHA1"), ) for pszUsername, pszRealm, pszPassword, preHA1 in arguments: self.assertRaises( TypeError, digest.calcHA1, "md5", pszUsername, pszRealm, pszPassword, "nonce", "cnonce", preHA1=preHA1 ) from zope.interface import implements from twisted.cred import portal, checkers class TestHTTPUser(object): """ Test avatar implementation for http auth with cred """ implements(IHTTPUser) username = None def __init__(self, username): """ @param username: The str username sent as part of the HTTP auth response. """ self.username = username class TestAuthRealm(object): """ Test realm that supports the IHTTPUser interface """ implements(portal.IRealm) def requestAvatar(self, avatarId, mind, *interfaces): if IHTTPUser in interfaces: if avatarId == checkers.ANONYMOUS: return IHTTPUser, TestHTTPUser('anonymous') return IHTTPUser, TestHTTPUser(avatarId) raise NotImplementedError("Only IHTTPUser interface is supported") class ProtectedResource(test_server.BaseTestResource): """ A test resource for use with HTTPAuthWrapper that holds on to it's request and segments so we can assert things about them. 
""" addSlash = True request = None segments = None def render(self, req): self.request = req return super(ProtectedResource, self).render(req) def locateChild(self, req, segments): self.segments = segments return super(ProtectedResource, self).locateChild(req, segments) class NonAnonymousResource(test_server.BaseTestResource): """ A resource that forces authentication by raising an HTTPError with an UNAUTHORIZED code if the request is an anonymous one. """ addSlash = True sendOwnHeaders = False def render(self, req): if req.avatar.username == 'anonymous': if not self.sendOwnHeaders: raise http.HTTPError(responsecode.UNAUTHORIZED) else: return http.Response( responsecode.UNAUTHORIZED, {'www-authenticate': [('basic', {'realm': 'foo'})]}) else: return super(NonAnonymousResource, self).render(req) class HTTPAuthResourceTest(test_server.BaseCase): """ Tests for the HTTPAuthWrapper Resource """ def setUp(self): """ Create a portal and add an in memory checker to it. Then set up a protectedResource that will be wrapped in each test. """ self.portal = portal.Portal(TestAuthRealm()) c = checkers.InMemoryUsernamePasswordDatabaseDontUse() c.addUser('username', 'password') self.portal.registerChecker(c) self.credFactory = basic.BasicCredentialFactory('test realm') self.protectedResource = ProtectedResource() self.protectedResource.responseText = "You shouldn't see me." def tearDown(self): """ Clean up by getting rid of the portal, credentialFactory, and protected resource """ del self.portal del self.credFactory del self.protectedResource def test_authenticatedRequest(self): """ Test that after successful authentication the request provides IAuthenticatedRequest and that the request.avatar implements the proper interfaces for this realm and has the proper values for this request. """ self.protectedResource.responseText = "I hope you can see me." root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) credentials = base64.encodestring('username:password') d = self.assertResponse((root, 'http://localhost/', {'authorization': ('basic', credentials)}), (200, {}, 'I hope you can see me.')) def checkRequest(result): resource = self.protectedResource self.failUnless(hasattr(resource, "request")) request = resource.request self.failUnless(IAuthenticatedRequest.providedBy(request)) self.failUnless(hasattr(request, "avatar")) self.failUnless(IHTTPUser.providedBy(request.avatar)) self.failUnless(hasattr(request, "avatarInterface")) self.assertEquals(request.avatarInterface, IHTTPUser) self.assertEquals(request.avatar.username, 'username') d.addCallback(checkRequest) return d def test_allowedMethods(self): """ Test that unknown methods result in a 401 instead of a 405 when authentication hasn't been completed. """ self.method = 'PROPFIND' root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) d = self.assertResponse( (root, 'http://localhost/'), (401, {'WWW-Authenticate': [('basic', {'realm': "test realm"})]}, None)) self.method = 'GET' return d def test_unauthorizedResponse(self): """ Test that a request with no credentials results in a valid Unauthorized response. 
""" root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) def makeDeepRequest(res): return self.assertResponse( (root, 'http://localhost/foo/bar/baz/bax'), (401, {'WWW-Authenticate': [('basic', {'realm': "test realm"})]}, None)) d = self.assertResponse( (root, 'http://localhost/'), (401, {'WWW-Authenticate': [('basic', {'realm': "test realm"})]}, None)) return d.addCallback(makeDeepRequest) def test_badCredentials(self): """ Test that a request with bad credentials results in a valid Unauthorized response """ root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) credentials = base64.encodestring('bad:credentials') d = self.assertResponse( (root, 'http://localhost/', {'authorization': [('basic', credentials)]}), (401, {'WWW-Authenticate': [('basic', {'realm': "test realm"})]}, None)) return d def test_successfulLogin(self): """ Test that a request with good credentials results in the appropriate response from the protected resource """ self.protectedResource.responseText = "I hope you can see me." root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) credentials = base64.encodestring('username:password') d = self.assertResponse((root, 'http://localhost/', {'authorization': ('basic', credentials)}), (200, {}, 'I hope you can see me.')) return d def test_wrongScheme(self): """ Test that a request with credentials for a scheme that is not advertised by this resource results in the appropriate unauthorized response. """ root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) d = self.assertResponse((root, 'http://localhost/', {'authorization': [('digest', 'realm="foo", response="crap"')]}), (401, {'www-authenticate': [('basic', {'realm': 'test realm'})]}, None)) return d def test_multipleWWWAuthenticateSchemes(self): """ Test that our unauthorized response can contain challenges for multiple authentication schemes. """ root = wrapper.HTTPAuthResource( self.protectedResource, (basic.BasicCredentialFactory('test realm'), FakeDigestCredentialFactory('md5', 'test realm')), self.portal, interfaces=(IHTTPUser,)) d = self.assertResponse((root, 'http://localhost/', {}), (401, {'www-authenticate': [challengeResponse, ('basic', {'realm': 'test realm'})]}, None)) return d def test_authorizationAgainstMultipleSchemes(self): """ Test that we can successfully authenticate when presented with multiple WWW-Authenticate headers """ root = wrapper.HTTPAuthResource( self.protectedResource, (basic.BasicCredentialFactory('test realm'), FakeDigestCredentialFactory('md5', 'test realm')), self.portal, interfaces=(IHTTPUser,)) def respondBasic(ign): credentials = base64.encodestring('username:password') d = self.assertResponse((root, 'http://localhost/', {'authorization': ('basic', credentials)}), (200, {}, None)) return d def respond(ign): d = self.assertResponse((root, 'http://localhost/', {'authorization': authRequest1}), (200, {}, None)) return d.addCallback(respondBasic) d = self.assertResponse((root, 'http://localhost/', {}), (401, {'www-authenticate': [challengeResponse, ('basic', {'realm': 'test realm'})]}, None)) return d def test_wrappedResourceGetsFullSegments(self): """ Test that the wrapped resource gets all the URL segments in it's locateChild. """ self.protectedResource.responseText = "I hope you can see me." 
root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) credentials = base64.encodestring('username:password') d = self.assertResponse((root, 'http://localhost/foo/bar/baz/bax', {'authorization': ('basic', credentials)}), (404, {}, None)) def checkSegments(ign): resource = self.protectedResource self.assertEquals(resource.segments, ['foo', 'bar', 'baz', 'bax']) d.addCallback(checkSegments) return d def test_invalidCredentials(self): """ Malformed or otherwise invalid credentials (as determined by the credential factory) should result in an Unauthorized response """ root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) credentials = base64.encodestring('Not Good Credentials') d = self.assertResponse((root, 'http://localhost/', {'authorization': ('basic', credentials)}), (401, {'WWW-Authenticate': [('basic', {'realm': "test realm"})]}, None)) return d def test_anonymousAuthentication(self): """ If our portal has a credentials checker for IAnonymous credentials authentication succeeds if no Authorization header is present """ self.portal.registerChecker(checkers.AllowAnonymousAccess()) self.protectedResource.responseText = "Anonymous access allowed" root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces=(IHTTPUser,)) def _checkRequest(ign): self.assertEquals( self.protectedResource.request.avatar.username, 'anonymous') d = self.assertResponse((root, 'http://localhost/', {}), (200, {}, "Anonymous access allowed")) d.addCallback(_checkRequest) return d def test_forceAuthentication(self): """ Test that if an HTTPError with an Unauthorized status code is raised from within our protected resource, we add the WWW-Authenticate headers if they do not already exist. """ self.portal.registerChecker(checkers.AllowAnonymousAccess()) nonAnonResource = NonAnonymousResource() nonAnonResource.responseText = "We don't like anonymous users" root = wrapper.HTTPAuthResource(nonAnonResource, [self.credFactory], self.portal, interfaces = (IHTTPUser,)) def _tryAuthenticate(result): credentials = base64.encodestring('username:password') d2 = self.assertResponse( (root, 'http://localhost/', {'authorization': ('basic', credentials)}), (200, {}, "We don't like anonymous users")) return d2 d = self.assertResponse( (root, 'http://localhost/', {}), (401, {'WWW-Authenticate': [('basic', {'realm': "test realm"})]}, None)) d.addCallback(_tryAuthenticate) return d def test_responseFilterDoesntClobberHeaders(self): """ Test that if an UNAUTHORIZED response is returned and already has 'WWW-Authenticate' headers we don't add them. """ self.portal.registerChecker(checkers.AllowAnonymousAccess()) nonAnonResource = NonAnonymousResource() nonAnonResource.responseText = "We don't like anonymous users" nonAnonResource.sendOwnHeaders = True root = wrapper.HTTPAuthResource(nonAnonResource, [self.credFactory], self.portal, interfaces = (IHTTPUser,)) d = self.assertResponse( (root, 'http://localhost/', {}), (401, {'WWW-Authenticate': [('basic', {'realm': "foo"})]}, None)) return d def test_renderHTTP(self): """ Test that if the renderHTTP method is ever called we authenticate the request and delegate rendering to the wrapper. """ self.protectedResource.responseText = "I hope you can see me." 
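        # This test drives HTTPAuthResource.renderHTTP() directly rather than
        # going through the assertResponse helper: first without credentials,
        # expecting an HTTPError carrying the WWW-Authenticate challenge, then
        # again with basic credentials, expecting the wrapped resource's body.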
self.protectedResource.addSlash = True root = wrapper.HTTPAuthResource(self.protectedResource, [self.credFactory], self.portal, interfaces = (IHTTPUser,)) request = SimpleRequest(None, "GET", "/") request.prepath = [''] def _gotSecondResponse(response): self.assertEquals(response.code, 200) self.assertEquals(str(response.stream.read()), "I hope you can see me.") def _gotResponse(exception): response = exception.response self.assertEquals(response.code, 401) self.failUnless(response.headers.hasHeader('WWW-Authenticate')) self.assertEquals(response.headers.getHeader('WWW-Authenticate'), [('basic', {'realm': "test realm"})]) credentials = base64.encodestring('username:password') request.headers.setHeader('authorization', ['basic', credentials]) d = root.renderHTTP(request) d.addCallback(_gotSecondResponse) d = self.assertFailure(root.renderHTTP(request), http.HTTPError) d.addCallback(_gotResponse) return d _trivial_GET = SimpleRequest(None, 'GET', '/') TwistedWeb2-8.1.0/twisted/web2/test/test_cgi.py0000644000175000017500000002004610566345211020016 0ustar dokodokoimport sys, os from twisted.trial import unittest from twisted.internet import reactor, interfaces, defer from twisted.python import util from twisted.web2 import twcgi, server, http, iweb from twisted.web2 import stream from twisted.web2.test.test_server import SimpleRequest DUMMY_CGI = ''' print "Header: OK" print print "cgi output" ''' READINPUT_CGI = ''' # this is an example of a correctly-written CGI script which reads a body # from stdin, which only reads env['CONTENT_LENGTH'] bytes. import os, sys body_length = int(os.environ.get('CONTENT_LENGTH',0)) indata = sys.stdin.read(body_length) print "Header: OK" print print "readinput ok" ''' READALLINPUT_CGI = ''' # this is an example of the typical (incorrect) CGI script which expects # the server to close stdin when the body of the request is complete. # A correct CGI should only read env['CONTENT_LENGTH'] bytes. import sys indata = sys.stdin.read() print "Header: OK" print print "readallinput ok" ''' def readStreamToString(s): """ Read all data from a stream into a string. @param s: a L{twisted.web2.stream.IByteStream} to read from. @return: a L{Deferred} results in a str """ allData = [] def gotData(data): allData.append(data) d = stream.readStream(s, gotData) d.addCallback(lambda ign: ''.join(allData)) return d class PythonScript(twcgi.FilteredScript): """ A specialized FilteredScript that just runs its file in a python interpreter. """ filters = (sys.executable,) # web2's version class CGITestBase(unittest.TestCase): """ Base class for CGI using tests """ def setUpResource(self, cgi): """ Set up the cgi resource to be tested. @param cgi: A string containing a Python CGI script. 
@return: A L{PythonScript} instance """ cgiFilename = os.path.abspath(self.mktemp()) cgiFile = file(cgiFilename, 'wt') cgiFile.write(cgi) cgiFile.close() return PythonScript(cgiFilename) def getPage(self, request, resource): """ Return the body of the given resource for the given request @param request: A L{SimpleRequest} instance to act on the resource @param resource: A L{IResource} to be rendered @return: A L{Deferred} that fires with the response body returned by resource for the request """ d = defer.maybeDeferred(resource.renderHTTP, request) d.addCallback(lambda resp: readStreamToString(resp.stream)) return d class CGI(CGITestBase): """ Test cases for basic twcgi.FilteredScript functionality """ def test_CGI(self): """ Test that the given DUMMY_CGI is executed and the expected output returned """ request = SimpleRequest(None, 'GET', '/cgi') resource = self.setUpResource(DUMMY_CGI) d = self.getPage(request, resource) d.addCallback(self._testCGI_1) return d def _testCGI_1(self, res): self.failUnlessEqual(res, "cgi output%s" % os.linesep) def testReadEmptyInput(self): """ Test that the CGI can successfully read from an empty input stream """ request = SimpleRequest(None, 'GET', '/cgi') resource = self.setUpResource(READINPUT_CGI) d = self.getPage(request, resource) d.addCallback(self._testReadEmptyInput_1) return d def _testReadEmptyInput_1(self, res): self.failUnlessEqual(res, "readinput ok%s" % os.linesep) def test_readInput(self): """ Test that we can successfully read an input stream with data """ request = SimpleRequest(None, "POST", "/cgi", content="Here is your stdin") resource = self.setUpResource(READINPUT_CGI) d = self.getPage(request, resource) d.addCallback(self._testReadInput_1) return d def _testReadInput_1(self, res): self.failUnlessEqual(res, "readinput ok%s" % os.linesep) def test_readAllInput(self): """ Test that we can all input can be read regardless of CONTENT_LENGTH """ request = SimpleRequest(None, "POST", "/cgi", content="Here is your stdin") resource = self.setUpResource(READALLINPUT_CGI) d = self.getPage(request, resource) d.addCallback(self._testReadAllInput_1) return d def _testReadAllInput_1(self, res): self.failUnlessEqual(res, "readallinput ok%s" % os.linesep) if not interfaces.IReactorProcess.providedBy(reactor): CGI.skip = "CGI tests require a functional reactor.spawnProcess()" class CGIDirectoryTest(CGITestBase): """ Test cases for twisted.web2.twcgi.CGIDirectory """ def setUp(self): temp = self.mktemp() os.mkdir(temp) cgiFile = open(os.path.join(temp, 'dummy'), 'wt') cgiFile.write(DUMMY_CGI) cgiFile.close() os.mkdir(os.path.join(temp, 'directory')) self.root = twcgi.CGIDirectory(temp) def test_notFound(self): """ Correctly handle non-existant children by returning a 404 """ self.assertRaises(http.HTTPError, self.root.locateChild, None, ('notHere',)) def test_cantRender(self): """ We do not support directory listing of CGIDirectories So our render method should always return a 403 """ response = self.root.render(None) self.failUnless(iweb.IResponse.providedBy(response)) self.assertEquals(response.code, 403) def test_foundScript(self): """ We should get twcgi.CGISCript instances when we locate a CGI """ resource, segments = self.root.locateChild(None, ('dummy',)) self.assertEquals(segments, ()) self.failUnless(isinstance(resource, (twcgi.CGIScript,))) def test_subDirectory(self): """ When a subdirectory is request we should get another CGIDirectory """ resource, segments = self.root.locateChild(None, ('directory', 'paths', 'that', 'dont', 'matter')) 
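        # Only the type of the located child matters for this check; the
        # remaining path segments are just along for the ride.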
self.failUnless(isinstance(resource, twcgi.CGIDirectory)) def createScript(self, filename): """ Write a dummy cgi script @param filename: a str destination for the cgi """ cgiFile = open(filename, 'wt') cgiFile.write("#!%s\n\n%s" % (sys.executable, DUMMY_CGI)) cgiFile.close() os.chmod(filename, 0700) def test_scriptsExecute(self): """ Verify that CGI scripts within a CGIDirectory can actually be executed """ cgiBinDir = os.path.abspath(self.mktemp()) os.mkdir(cgiBinDir) root = twcgi.CGIDirectory(cgiBinDir) self.createScript(os.path.join(cgiBinDir, 'dummy')) cgiSubDir = os.path.join(cgiBinDir, 'sub') os.mkdir(cgiSubDir) self.createScript(os.path.join(cgiSubDir, 'dummy')) site = server.Site(root) request = SimpleRequest(site, "GET", "/dummy") d = request.locateResource('/dummy') def _firstResponse(res): self.failUnlessEqual(res, "cgi output%s" % os.linesep) def _firstRequest(resource): d1 = self.getPage(request, resource) d1.addCallback(_firstResponse) return d1 d.addCallback(_firstRequest) def _secondResponse(res): self.failUnlessEqual(res, "cgi output%s" % os.linesep) def _secondRequest(ign): request = SimpleRequest(site, "GET", '/sub/dummy') d2 = request.locateResource('/sub/dummy') d2.addCallback(lambda resource: self.getPage(request, resource)) d2.addCallback(_secondResponse) return d2 d.addCallback(_secondRequest) return d TwistedWeb2-8.1.0/twisted/web2/test/test_vhost.py0000644000175000017500000001411010437424617020417 0ustar dokodokofrom twisted.web2.test.test_server import BaseCase, BaseTestResource from twisted.web2 import resource from twisted.web2 import vhost from twisted.web2 import http, responsecode from twisted.web2 import iweb from twisted.web2 import stream from twisted.web2 import http_headers class HostResource(BaseTestResource): addSlash=True def child_bar(self, req): return self def render(self, req): h = req.host return http.Response(responsecode.OK, stream=h) class TestVhost(BaseCase): root = vhost.NameVirtualHost(default=HostResource()) def setUp(self): self.root.addHost('foo', HostResource()) def testNameVirtualHost(self): """ Test basic Name Virtual Host behavior 1) NameVirtualHost.default is defined, so an undefined NVH (localhost) gets handled by NameVirtualHost.default 2) A defined NVH gets passed the proper host header and is handled by the proper resource """ self.assertResponse( (self.root, 'http://localhost/'), (200, {}, 'localhost')) self.assertResponse( (self.root, 'http://foo/'), (200, {}, 'foo')) def testNoDefault(self): root = vhost.NameVirtualHost() # Test lack of host specified self.assertResponse( (root, 'http://frob/'), (404, {}, None)) def testNameVirtualHostWithChildren(self): """ Test that children of a defined NVH are handled appropriately """ self.assertResponse( (self.root, 'http://foo/bar/'), (200, {}, 'foo')) def testNameVirtualHostWithNesting(self): """ Test that an unknown virtual host gets handled by the domain parent and passed on to the parent's resource. 
""" nested = vhost.NameVirtualHost() nested.addHost('is.nested', HostResource()) self.root.addHost('nested', nested) self.assertResponse( (self.root, 'http://is.nested/'), (200, {}, 'is.nested')) class PathResource(resource.LeafResource): def render(self, req): response = req.scheme+'://'+'/'.join([req.host,] + req.prepath + req.postpath) return http.Response(responsecode.OK, stream=response) class TestURIRewrite(BaseCase): def testVHostURIRewrite(self): """Test that the hostname, path, and scheme are properly rewritten to defined domain """ vur = vhost.VHostURIRewrite('https://www.apachesucks.org/some/path/', PathResource()) self.assertResponse( (vur, 'http://localhost/'), (200, {}, 'https://www.apachesucks.org/some/path/')) def testVHostURIRewriteWithChildren(self): """ Test that the hostname is properly rewritten and that children are located """ vur = vhost.VHostURIRewrite('http://www.apachesucks.org/', HostResource(children=[('foo', PathResource())])) self.assertResponse( (vur, 'http://localhost/foo'), (200, {}, 'http://www.apachesucks.org/foo')) def testVHostURIRewriteAsChild(self): """ Test that a VHostURIRewrite can exist anywhere in the resource tree """ root = HostResource(children=[('bar', HostResource(children=[ ('vhost.rpy', vhost.VHostURIRewrite('http://www.apachesucks.org/', PathResource() ))]))]) self.assertResponse( (root, 'http://localhost/bar/vhost.rpy/foo'), (200, {}, 'http://www.apachesucks.org/foo')) def testVHostURIRewriteWithSibling(self): """ Test that two VHostURIRewrite objects can exist on the same level of the resource tree. """ root = HostResource(children=[ ('vhost1', vhost.VHostURIRewrite('http://foo.bar/', PathResource())), ('vhost2', vhost.VHostURIRewrite('http://baz.bax/', PathResource()))]) self.assertResponse( (root, 'http://localhost/vhost1/'), (200, {}, 'http://foo.bar/')) self.assertResponse( (root, 'http://localhost/vhost2/'), (200, {}, 'http://baz.bax/')) def raw(d): headers=http_headers.Headers() for k,v in d.iteritems(): headers.setRawHeaders(k, [v]) return headers class RemoteAddrResource(resource.LeafResource): def render(self, req): return http.Response(200, stream=str(req.remoteAddr)) class TestAutoVHostRewrite(BaseCase): def setUp(self): self.root = vhost.AutoVHostURIRewrite(PathResource()) def testFullyRewrite(self): self.assertResponse( (self.root, 'http://localhost/quux', raw({'x-forwarded-host':'foo.bar', 'x-forwarded-for':'1.2.3.4', 'x-app-location':'/baz/', 'x-app-scheme':'https'})), (200, {}, 'https://foo.bar/baz/quux')) def testRemoteAddr(self): self.assertResponse( (vhost.AutoVHostURIRewrite(RemoteAddrResource()), 'http://localhost/', raw({'x-forwarded-host':'foo.bar', 'x-forwarded-for':'1.2.3.4'})), (200, {}, "IPv4Address(TCP, '1.2.3.4', 0)")) def testSendsRealHost(self): self.assertResponse( (vhost.AutoVHostURIRewrite(PathResource(), sendsRealHost=True), 'http://localhost/', raw({'host': 'foo.bar', 'x-forwarded-host': 'baz.bax', 'x-forwarded-for': '1.2.3.4'})), (200, {}, 'http://foo.bar/')) def testLackingHeaders(self): self.assertResponse( (self.root, 'http://localhost/', {}), (400, {}, None)) def testMinimalHeaders(self): self.assertResponse( (self.root, 'http://localhost/', raw({'x-forwarded-host':'foo.bar', 'x-forwarded-for':'1.2.3.4'})), (200, {}, 'http://foo.bar/')) TwistedWeb2-8.1.0/twisted/web2/test/test_xmlrpc.py0000644000175000017500000002073410535453244020567 0ustar dokodoko# -*- test-case-name: twisted.web.test.test_xmlrpc -*- # # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. 
# """Test XML-RPC support.""" import xmlrpclib from twisted.web2 import xmlrpc from twisted.web2.xmlrpc import XMLRPC, addIntrospection from twisted.internet import defer from twisted.web2.test.test_server import BaseCase class TestRuntimeError(RuntimeError): """ Fake RuntimeError for testing purposes. """ class TestValueError(ValueError): """ Fake ValueError for testing purposes. """ class XMLRPCTestResource(XMLRPC): """ This is the XML-RPC "server" against which the tests will be run. """ FAILURE = 666 NOT_FOUND = 23 SESSION_EXPIRED = 42 addSlash = True # cause it's at the root # the doc string is part of the test def xmlrpc_add(self, request, a, b): """This function add two numbers.""" return a + b xmlrpc_add.signature = [['int', 'int', 'int'], ['double', 'double', 'double']] # the doc string is part of the test def xmlrpc_pair(self, request, string, num): """This function puts the two arguments in an array.""" return [string, num] xmlrpc_pair.signature = [['array', 'string', 'int']] # the doc string is part of the test def xmlrpc_defer(self, request, x): """Help for defer.""" return defer.succeed(x) def xmlrpc_deferFail(self, request): return defer.fail(TestValueError()) # don't add a doc string, it's part of the test def xmlrpc_fail(self, request): raise TestRuntimeError def xmlrpc_fault(self, request): return xmlrpc.Fault(12, "hello") def xmlrpc_deferFault(self, request): return defer.fail(xmlrpc.Fault(17, "hi")) def xmlrpc_complex(self, request): return {"a": ["b", "c", 12, []], "D": "foo"} def xmlrpc_dict(self, request, map, key): return map[key] def getFunction(self, functionPath): try: return XMLRPC.getFunction(self, functionPath) except xmlrpc.NoSuchFunction: if functionPath.startswith("SESSION"): raise xmlrpc.Fault(self.SESSION_EXPIRED, "Session non-existant/expired.") else: raise xmlrpc_dict.help = 'Help for dict.' class XMLRPCServerBase(BaseCase): """ The parent class of the XML-RPC test classes. """ method = 'POST' version = (1, 1) def setUp(self): self.root = XMLRPCTestResource() self.xml = ("\n\n" + "%s\n") class XMLRPCServerGETTest(XMLRPCServerBase): """ Attempt access to the RPC resources as regular HTTP resource. """ def setUp(self): super(XMLRPCServerGETTest, self).setUp() self.method = 'GET' self.errorRPC = ('XML-RPC responder' + '

</head><body><h1>XML-RPC responder</h1>POST your XML-RPC ' + 'here.</body></html>')
        self.errorHTTP = ('<html><head><title>404 Not Found</title>' + '</head><body><h1>Not Found</h1>
The resource http://host/add ' + 'cannot be found.') def test_rootGET(self): """ Test a simple GET against the XML-RPC server. """ return self.assertResponse( (self.root, 'http://host/'), (200, {}, self.errorRPC)) def test_childGET(self): """ Try to access an XML-RPC method as a regular resource via GET. """ return self.assertResponse( (self.root, 'http://host/add'), (404, {}, self.errorHTTP)) class XMLRPCServerPOSTTest(XMLRPCServerBase): """ Tests for standard XML-RPC usage. """ def test_RPCMethods(self): """ Make RPC calls of the defined methods, checking for the expected results. """ inputOutput = [ ("add", (2, 3), 5), ("defer", ("a",), "a"), ("dict", ({"a": 1}, "a"), 1), ("pair", ("a", 1), ["a", 1]), ("complex", (), {"a": ["b", "c", 12, []], "D": "foo"})] dl = [] for meth, args, outp in inputOutput: postdata = xmlrpclib.dumps(args, meth) respdata = xmlrpclib.dumps((outp,)) reqdata = (self.root, 'http://host/', {}, None, None, '', postdata) d = self.assertResponse(reqdata, (200, {}, self.xml % respdata)) dl.append(d) return defer.DeferredList(dl, fireOnOneErrback=True) def test_RPCFaults(self): """ Ensure that RPC faults are properly processed. """ dl = [] codeMethod = [ (12, "fault", 'hello'), (23, "noSuchMethod", 'function noSuchMethod not found'), (17, "deferFault", 'hi'), (42, "SESSION_TEST", 'Session non-existant/expired.')] for code, meth, fault in codeMethod: postdata = xmlrpclib.dumps((), meth) respdata = xmlrpclib.dumps(xmlrpc.Fault(code, fault)) reqdata = (self.root, 'http://host/', {}, None, None, '', postdata) d = self.assertResponse(reqdata, (200, {}, respdata)) dl.append(d) d = defer.DeferredList(dl, fireOnOneErrback=True) return d def test_RPCFailures(self): """ Ensure that failures behave as expected. """ dl = [] codeMethod = [ (666, "fail"), (666, "deferFail")] for code, meth in codeMethod: postdata = xmlrpclib.dumps((), meth) respdata = xmlrpclib.dumps(xmlrpc.Fault(code, 'error')) reqdata = (self.root, 'http://host/', {}, None, None, '', postdata) d = self.assertResponse(reqdata, (200, {}, respdata)) d.addCallback(self.flushLoggedErrors, TestRuntimeError, TestValueError) dl.append(d) d = defer.DeferredList(dl, fireOnOneErrback=True) return d class XMLRPCTestIntrospection(XMLRPCServerBase): def setUp(self): """ Introspection requires additional setup, most importantly, adding introspection to the root object. """ super(XMLRPCTestIntrospection, self).setUp() addIntrospection(self.root) self.methodList = ['add', 'complex', 'defer', 'deferFail', 'deferFault', 'dict', 'fail', 'fault', 'pair', 'system.listMethods', 'system.methodHelp', 'system.methodSignature'] def test_listMethods(self): """ Check that the introspection method "listMethods" returns all the methods we defined in the XML-RPC server. """ def cbMethods(meths): meths.sort() self.failUnlessEqual( meths, ) postdata = xmlrpclib.dumps((), 'system.listMethods') respdata = xmlrpclib.dumps((self.methodList,)) reqdata = (self.root, 'http://host/', {}, None, None, '', postdata) return self.assertResponse(reqdata, (200, {}, self.xml % respdata)) def test_methodHelp(self): """ Check the RPC methods for docstrings or .help attributes. 
""" inputOutput = [ ("defer", "Help for defer."), ("fail", ""), ("dict", "Help for dict.")] dl = [] for meth, outp in inputOutput: postdata = xmlrpclib.dumps((meth,), 'system.methodHelp') respdata = xmlrpclib.dumps((outp,)) reqdata = (self.root, 'http://host/', {}, None, None, '', postdata) d = self.assertResponse(reqdata, (200, {}, self.xml % respdata)) dl.append(d) return defer.DeferredList(dl, fireOnOneErrback=True) def test_methodSignature(self): """ Check that the RPC methods whose signatures have been set via the .signature attribute (on the method) are returned as expected. """ inputOutput = [ ("defer", ""), ("add", [['int', 'int', 'int'], ['double', 'double', 'double']]), ("pair", [['array', 'string', 'int']])] dl = [] for meth, outp in inputOutput: postdata = xmlrpclib.dumps((meth,), 'system.methodSignature') respdata = xmlrpclib.dumps((outp,)) reqdata = (self.root, 'http://host/', {}, None, None, '', postdata) d = self.assertResponse(reqdata, (200, {}, self.xml % respdata)) dl.append(d) return defer.DeferredList(dl, fireOnOneErrback=True) TwistedWeb2-8.1.0/twisted/web2/test/test_server.py0000644000175000017500000007100110713026424020553 0ustar dokodoko# Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ A test harness for the twisted.web2 server. """ from zope.interface import implements from twisted.python import components from twisted.web2 import http, http_headers, iweb, server from twisted.web2 import resource, stream, compat from twisted.trial import unittest from twisted.internet import reactor, defer, address class NotResource(object): """ Class which does not implement IResource. Used as an adaptee by L{AdaptionTestCase.test_registered} to test that if an object which does not provide IResource is adapted to IResource and there is an adapter to IResource registered, that adapter is used. """ class ResourceAdapter(object): """ Adapter to IResource. Registered as an adapter from NotResource to IResource so that L{AdaptionTestCase.test_registered} can test that such an adapter will be used. """ implements(iweb.IResource) def __init__(self, original): pass components.registerAdapter(ResourceAdapter, NotResource, iweb.IResource) class NotOldResource(object): """ Class which does not implement IOldNevowResource or IResource. Used as an adaptee by L{AdaptionTestCase.test_transitive} to test that if an object which does not provide IResource or IOldNevowResource is adapted to IResource and there is an adapter to IOldNevowResource registered, first that adapter is used, then the included adapter from IOldNevowResource to IResource is used. """ class OldResourceAdapter(object): """ Adapter to IOldNevowResource. Registered as an adapter from NotOldResource to IOldNevowResource so that L{AdaptionTestCase.test_transitive} can test that such an adapter will be used to allow the initial input to be adapted to IResource. """ implements(iweb.IOldNevowResource) def __init__(self, original): pass components.registerAdapter(OldResourceAdapter, NotOldResource, iweb.IOldNevowResource) class AdaptionTestCase(unittest.TestCase): """ Test the adaption of various objects to IResource. Necessary due to the special implementation of __call__ on IResource which extends the behavior provided by the base Interface.__call__. """ def test_unadaptable(self): """ Test that attempting to adapt to IResource an object not adaptable to IResource raises an exception or returns the specified alternate object. 
""" class Unadaptable(object): pass self.assertRaises(TypeError, iweb.IResource, Unadaptable()) alternate = object() self.assertIdentical(iweb.IResource(Unadaptable(), alternate), alternate) def test_redundant(self): """ Test that the adaption to IResource of an object which provides IResource returns the same object. """ class Resource(object): implements(iweb.IResource) resource = Resource() self.assertIdentical(iweb.IResource(resource), resource) def test_registered(self): """ Test that if an adapter exists which can provide IResource for an object which does not provide it, that adapter is used. """ notResource = NotResource() self.failUnless(isinstance(iweb.IResource(notResource), ResourceAdapter)) def test_oldResources(self): """ Test that providers of L{IOldNevowResource} can be adapted to IResource automatically. """ class OldResource(object): implements(iweb.IOldNevowResource) oldResource = OldResource() resource = iweb.IResource(oldResource) self.failUnless(isinstance(resource, compat.OldNevowResourceAdapter)) def test_transitive(self): """ Test that a special-case transitive adaption from something to IOldNevowResource to IResource is possible. """ notResource = NotOldResource() resource = iweb.IResource(notResource) self.failUnless(isinstance(resource, compat.OldNevowResourceAdapter)) class SimpleRequest(server.Request): """I can be used in cases where a Request object is necessary but it is benificial to bypass the chanRequest """ clientproto = (1,1) def __init__(self, site, method, uri, headers=None, content=None): if not headers: headers = http_headers.Headers(headers) super(SimpleRequest, self).__init__( site=site, chanRequest=None, command=method, path=uri, version=self.clientproto, contentLength=len(content or ''), headers=headers) self.stream = stream.MemoryStream(content or '') self.remoteAddr = address.IPv4Address('TCP', '127.0.0.1', 0) self._parseURL() self.host = 'localhost' self.port = 8080 def writeResponse(self, response): return response class TestChanRequest: implements(iweb.IChanRequest) hostInfo = address.IPv4Address('TCP', 'host', 80), False remoteHost = address.IPv4Address('TCP', 'remotehost', 34567) def __init__(self, site, method, prepath, uri, length=None, headers=None, version=(1,1), content=None): self.site = site self.method = method self.prepath = prepath self.uri = uri if headers is None: headers = http_headers.Headers() self.headers = headers self.http_version = version # Anything below here we do not pass as arguments self.request = server.Request(self, self.method, self.uri, self.http_version, length, self.headers, site=self.site, prepathuri=self.prepath) if content is not None: self.request.handleContentChunk(content) self.request.handleContentComplete() self.code = None self.responseHeaders = None self.data = '' self.deferredFinish = defer.Deferred() def writeIntermediateResponse(code, headers=None): pass def writeHeaders(self, code, headers): self.responseHeaders = headers self.code = code def write(self, data): self.data += data def finish(self, failed=False): result = self.code, self.responseHeaders, self.data, failed self.finished = True self.deferredFinish.callback(result) def abortConnection(self): self.finish(failed=True) def registerProducer(self, producer, streaming): pass def unregisterProducer(self): pass def getHostInfo(self): return self.hostInfo def getRemoteHost(self): return self.remoteHost class BaseTestResource(resource.Resource): responseCode = 200 responseText = 'This is a fake resource.' 
responseHeaders = {} addSlash = False def __init__(self, children=[]): """ @type children: C{list} of C{tuple} @param children: a list of ('path', resource) tuples """ for i in children: self.putChild(i[0], i[1]) def render(self, req): return http.Response(self.responseCode, headers=self.responseHeaders, stream=self.responseStream()) def responseStream(self): return stream.MemoryStream(self.responseText) _unset = object() class BaseCase(unittest.TestCase): """ Base class for test cases that involve testing the result of arbitrary HTTP(S) queries. """ method = 'GET' version = (1, 1) wait_timeout = 5.0 def chanrequest(self, root, uri, length, headers, method, version, prepath, content): site = server.Site(root) return TestChanRequest(site, method, prepath, uri, length, headers, version, content) def getResponseFor(self, root, uri, headers={}, method=None, version=None, prepath='', content=None, length=_unset): if not isinstance(headers, http_headers.Headers): headers = http_headers.Headers(headers) if length is _unset: if content is not None: length = len(content) else: length = 0 if method is None: method = self.method if version is None: version = self.version cr = self.chanrequest(root, uri, length, headers, method, version, prepath, content) cr.request.process() return cr.deferredFinish def assertResponse(self, request_data, expected_response, failure=False): """ @type request_data: C{tuple} @type expected_response: C{tuple} @param request_data: A tuple of arguments to pass to L{getResponseFor}: (root, uri, headers, method, version, prepath). Root resource and requested URI are required, and everything else is optional. @param expected_response: A 3-tuple of the expected response: (responseCode, headers, htmlData) """ d = self.getResponseFor(*request_data) d.addCallback(self._cbGotResponse, expected_response, failure) return d def _cbGotResponse(self, (code, headers, data, failed), expected_response, expectedfailure=False): expected_code, expected_headers, expected_data = expected_response self.assertEquals(code, expected_code) if expected_data is not None: self.assertEquals(data, expected_data) for key, value in expected_headers.iteritems(): self.assertEquals(headers.getHeader(key), value) self.assertEquals(failed, expectedfailure) class SampleWebTest(BaseCase): class SampleTestResource(BaseTestResource): addSlash = True def child_validChild(self, req): f = BaseTestResource() f.responseCode = 200 f.responseText = 'This is a valid child resource.' 
return f def child_missingChild(self, req): f = BaseTestResource() f.responseCode = 404 f.responseStream = lambda self: None return f def child_remoteAddr(self, req): f = BaseTestResource() f.responseCode = 200 f.responseText = 'Remote Addr: %r' % req.remoteAddr.host return f def setUp(self): self.root = self.SampleTestResource() def test_root(self): return self.assertResponse( (self.root, 'http://host/'), (200, {}, 'This is a fake resource.')) def test_validChild(self): return self.assertResponse( (self.root, 'http://host/validChild'), (200, {}, 'This is a valid child resource.')) def test_invalidChild(self): return self.assertResponse( (self.root, 'http://host/invalidChild'), (404, {}, None)) def test_remoteAddrExposure(self): return self.assertResponse( (self.root, 'http://host/remoteAddr'), (200, {}, "Remote Addr: 'remotehost'")) def test_leafresource(self): class TestResource(resource.LeafResource): def render(self, req): return http.Response(stream="prepath:%s postpath:%s" % ( req.prepath, req.postpath)) return self.assertResponse( (TestResource(), 'http://host/consumed/path/segments'), (200, {}, "prepath:[] postpath:['consumed', 'path', 'segments']")) def test_redirectResource(self): redirectResource = resource.RedirectResource(scheme='https', host='localhost', port=443, path='/foo', querystring='bar=baz') return self.assertResponse( (redirectResource, 'http://localhost/'), (301, {'location': 'https://localhost/foo?bar=baz'}, None)) class URLParsingTest(BaseCase): class TestResource(resource.LeafResource): def render(self, req): return http.Response(stream="Host:%s, Path:%s"%(req.host, req.path)) def setUp(self): self.root = self.TestResource() def test_normal(self): return self.assertResponse( (self.root, '/path', {'Host':'host'}), (200, {}, 'Host:host, Path:/path')) def test_fullurl(self): return self.assertResponse( (self.root, 'http://host/path'), (200, {}, 'Host:host, Path:/path')) def test_strangepath(self): # Ensure that the double slashes don't confuse it return self.assertResponse( (self.root, '//path', {'Host':'host'}), (200, {}, 'Host:host, Path://path')) def test_strangepathfull(self): return self.assertResponse( (self.root, 'http://host//path'), (200, {}, 'Host:host, Path://path')) class TestDeferredRendering(BaseCase): class ResourceWithDeferreds(BaseTestResource): addSlash=True responseText = 'I should be wrapped in a Deferred.' def render(self, req): d = defer.Deferred() reactor.callLater( 0, d.callback, BaseTestResource.render(self, req)) return d def child_deferred(self, req): d = defer.Deferred() reactor.callLater(0, d.callback, BaseTestResource()) return d def test_deferredRootResource(self): return self.assertResponse( (self.ResourceWithDeferreds(), 'http://host/'), (200, {}, 'I should be wrapped in a Deferred.')) def test_deferredChild(self): return self.assertResponse( (self.ResourceWithDeferreds(), 'http://host/deferred'), (200, {}, 'This is a fake resource.')) class RedirectResourceTest(BaseCase): def html(url): return "Moved Permanently

</title></head><body><h1>Moved Permanently</h1><p>Document moved to %s.</p></body></html>
" % (url,) html = staticmethod(html) def test_noRedirect(self): # This is useless, since it's a loop, but hey ds = [] for url in ("http://host/", "http://host/foo"): ds.append(self.assertResponse( (resource.RedirectResource(), url), (301, {"location": url}, self.html(url)) )) return defer.DeferredList(ds, fireOnOneErrback=True) def test_hostRedirect(self): ds = [] for url1, url2 in ( ("http://host/", "http://other/"), ("http://host/foo", "http://other/foo"), ): ds.append(self.assertResponse( (resource.RedirectResource(host="other"), url1), (301, {"location": url2}, self.html(url2)) )) return defer.DeferredList(ds, fireOnOneErrback=True) def test_pathRedirect(self): root = BaseTestResource() redirect = resource.RedirectResource(path="/other") root.putChild("r", redirect) ds = [] for url1, url2 in ( ("http://host/r", "http://host/other"), ("http://host/r/foo", "http://host/other"), ): ds.append(self.assertResponse( (resource.RedirectResource(path="/other"), url1), (301, {"location": url2}, self.html(url2)) )) return defer.DeferredList(ds, fireOnOneErrback=True) class EmptyResource(resource.Resource): def __init__(self, test): self.test = test def render(self, request): self.test.assertEquals(request.urlForResource(self), self.expectedURI) return 201 class RememberURIs(BaseCase): """ Tests for URI memory and lookup mechanism in server.Request. """ def test_requestedResource(self): """ Test urlForResource() on deeply nested resource looked up via request processing. """ root = EmptyResource(self) root.expectedURI = "/" foo = EmptyResource(self) foo.expectedURI = "/foo" root.putChild("foo", foo) bar = EmptyResource(self) bar.expectedURI = foo.expectedURI + "/bar" foo.putChild("bar", bar) baz = EmptyResource(self) baz.expectedURI = bar.expectedURI + "/baz" bar.putChild("baz", baz) ds = [] for uri in (foo.expectedURI, bar.expectedURI, baz.expectedURI): ds.append(self.assertResponse( (root, uri, {'Host':'host'}), (201, {}, None), )) return defer.DeferredList(ds, fireOnOneErrback=True) def test_urlEncoding(self): """ Test to make sure that URL encoding is working. """ root = EmptyResource(self) root.expectedURI = "/" child = EmptyResource(self) child.expectedURI = "/foo%20bar" root.putChild("foo bar", child) return self.assertResponse( (root, child.expectedURI, {'Host':'host'}), (201, {}, None) ) def test_locateResource(self): """ Test urlForResource() on resource looked up via a locateResource() call. """ root = resource.Resource() child = resource.Resource() root.putChild("foo", child) request = SimpleRequest(server.Site(root), "GET", "/") def gotResource(resource): self.assertEquals("/foo", request.urlForResource(resource)) d = defer.maybeDeferred(request.locateResource, "/foo") d.addCallback(gotResource) return d def test_unknownResource(self): """ Test urlForResource() on unknown resource. """ root = resource.Resource() child = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/") self.assertRaises(server.NoURLForResourceError, request.urlForResource, child) def test_locateChildResource(self): """ Test urlForResource() on deeply nested resource looked up via locateChildResource(). 
""" root = EmptyResource(self) root.expectedURI = "/" foo = EmptyResource(self) foo.expectedURI = "/foo" root.putChild("foo", foo) bar = EmptyResource(self) bar.expectedURI = "/foo/bar" foo.putChild("bar", bar) baz = EmptyResource(self) baz.expectedURI = "/foo/bar/b%20a%20z" bar.putChild("b a z", baz) request = SimpleRequest(server.Site(root), "GET", "/") def gotResource(resource): # Make sure locateChildResource() gave us the right answer self.assertEquals(resource, bar) return request.locateChildResource(resource, "b a z").addCallback(gotChildResource) def gotChildResource(resource): # Make sure locateChildResource() gave us the right answer self.assertEquals(resource, baz) self.assertEquals(resource.expectedURI, request.urlForResource(resource)) d = request.locateResource(bar.expectedURI) d.addCallback(gotResource) return d def test_deferredLocateChild(self): """ Test deferred value from locateChild() """ class DeferredLocateChild(resource.Resource): def locateChild(self, req, segments): return defer.maybeDeferred( super(DeferredLocateChild, self).locateChild, req, segments ) root = DeferredLocateChild() child = resource.Resource() root.putChild("foo", child) request = SimpleRequest(server.Site(root), "GET", "/foo") def gotResource(resource): self.assertEquals("/foo", request.urlForResource(resource)) d = request.locateResource("/foo") d.addCallback(gotResource) return d class ParsePostDataTests(unittest.TestCase): """ Tests for L{server.parsePOSTData}. """ def test_noData(self): """ Parsing a request without data should succeed but should not fill the C{args} and C{files} attributes of the request. """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/") def cb(ign): self.assertEquals(request.args, {}) self.assertEquals(request.files, {}) return server.parsePOSTData(request).addCallback(cb) def test_noContentType(self): """ Parsing a request without content-type should succeed but should not fill the C{args} and C{files} attributes of the request. """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", content="foo") def cb(ign): self.assertEquals(request.args, {}) self.assertEquals(request.files, {}) return server.parsePOSTData(request).addCallback(cb) def test_urlencoded(self): """ Test parsing data in urlencoded format: it should end in the C{args} attribute. """ ctype = http_headers.MimeType('application', 'x-www-form-urlencoded') content = "key=value&multiple=two+words&multiple=more%20words" root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) def cb(ign): self.assertEquals(request.files, {}) self.assertEquals(request.args, {'multiple': ['two words', 'more words'], 'key': ['value']}) return server.parsePOSTData(request).addCallback(cb) def test_multipart(self): """ Test parsing data in multipart format: it should fill the C{files} attribute. 
""" ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---weeboundary'),)) content="""-----weeboundary\r Content-Disposition: form-data; name="FileNameOne"; filename="myfilename"\r Content-Type: text/html\r \r my great content wooo\r -----weeboundary--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) def cb(ign): self.assertEquals(request.args, {}) self.assertEquals(request.files.keys(), ['FileNameOne']) self.assertEquals(request.files.values()[0][0][:2], ('myfilename', http_headers.MimeType('text', 'html', {}))) f = request.files.values()[0][0][2] self.assertEquals(f.read(), "my great content wooo") return server.parsePOSTData(request).addCallback(cb) def test_multipartWithNoBoundary(self): """ If the boundary type is not specified, parsing should fail with a C{http.HTTPError}. """ ctype = http_headers.MimeType('multipart', 'form-data') content="""-----weeboundary\r Content-Disposition: form-data; name="FileNameOne"; filename="myfilename"\r Content-Type: text/html\r \r my great content wooo\r -----weeboundary--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) return self.assertFailure(server.parsePOSTData(request), http.HTTPError) def test_wrongContentType(self): """ Check that a content-type not handled raise a C{http.HTTPError}. """ ctype = http_headers.MimeType('application', 'foobar') content = "key=value&multiple=two+words&multiple=more%20words" root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) return self.assertFailure(server.parsePOSTData(request), http.HTTPError) def test_mimeParsingError(self): """ A malformed content should result in a C{http.HTTPError}. The tested content has an invalid closing boundary. """ ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---weeboundary'),)) content="""-----weeboundary\r Content-Disposition: form-data; name="FileNameOne"; filename="myfilename"\r Content-Type: text/html\r \r my great content wooo\r -----weeoundary--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) return self.assertFailure(server.parsePOSTData(request), http.HTTPError) def test_multipartMaxMem(self): """ Check that the C{maxMem} parameter makes the parsing raise an exception if the value is reached. """ ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---weeboundary'),)) content="""-----weeboundary\r Content-Disposition: form-data; name="FileNameOne"\r Content-Type: text/html\r \r my great content wooo and even more and more\r -----weeboundary--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) def cb(res): self.assertEquals(res.response.description, "Maximum length of 10 bytes exceeded.") return self.assertFailure(server.parsePOSTData(request, maxMem=10), http.HTTPError).addCallback(cb) def test_multipartMaxSize(self): """ Check that the C{maxSize} parameter makes the parsing raise an exception if the data is too big. 
""" ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---weeboundary'),)) content="""-----weeboundary\r Content-Disposition: form-data; name="FileNameOne"; filename="myfilename"\r Content-Type: text/html\r \r my great content wooo and even more and more\r -----weeboundary--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) def cb(res): self.assertEquals(res.response.description, "Maximum length of 10 bytes exceeded.") return self.assertFailure(server.parsePOSTData(request, maxSize=10), http.HTTPError).addCallback(cb) def test_maxFields(self): """ Check that the C{maxSize} parameter makes the parsing raise an exception if the data contains too many fields. """ ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---xyz'),)) content = """-----xyz\r Content-Disposition: form-data; name="foo"\r \r Foo Bar\r -----xyz\r Content-Disposition: form-data; name="foo"\r \r Baz\r -----xyz\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/html\r \r blah\r -----xyz\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/plain\r \r bleh\r -----xyz--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) def cb(res): self.assertEquals(res.response.description, "Maximum number of fields 3 exceeded") return self.assertFailure(server.parsePOSTData(request, maxFields=3), http.HTTPError).addCallback(cb) def test_otherErrors(self): """ Test that errors durign parsing other than C{MimeFormatError} are propagated. """ ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---weeboundary'),)) # XXX: maybe this is not a good example # parseContentDispositionFormData could handle this problem content="""-----weeboundary\r Content-Disposition: form-data; name="FileNameOne"; filename="myfilename and invalid data \r -----weeboundary--\r """ root = resource.Resource() request = SimpleRequest(server.Site(root), "GET", "/", http_headers.Headers({'content-type': ctype}), content) return self.assertFailure(server.parsePOSTData(request), ValueError) TwistedWeb2-8.1.0/twisted/web2/test/test_wsgi.py0000644000175000017500000002714410634575711020241 0ustar dokodoko# Copyright (c) 2005-2007 Twisted Matrix Laboratories. # See LICENSE for details. import time from twisted.web2.test.test_server import BaseCase from twisted.internet import reactor, interfaces, defer from twisted.python import log if interfaces.IReactorThreads(reactor, None) is not None: from twisted.web2.wsgi import WSGIResource as WSGI else: WSGI = None class TestError(Exception): pass class TestContainer(BaseCase): """ Tests various applications with the WSGI container. """ def flushErrors(self, result, error): """ Flush the specified C{error] and forward C{result}. """ self.flushLoggedErrors(error) return result def test_getContainedResource(self): """ Test that non-blocking WSGI applications render properly. """ def application(environ, start_response): status = '200 OK' response_headers = [('Content-type','text/html')] writer = start_response(status, response_headers) writer('') return ['

<html><h1>Some HTML</h1></html>', ''] return self.assertResponse( (WSGI(application), 'http://host/'), (200, {"Content-Length": None}, '<html><h1>Some HTML</h1></html>'))
    def test_getBlockingResource(self): """ Test that blocking WSGI applications render properly. """ def application(environ, start_response): """ Simplest possible application object. """ status = '200 OK' response_headers = [('Content-type','text/html')] writer = start_response(status, response_headers) writer('<html><h1>A little bit') time.sleep(1) writer(' of HTML</h1>') time.sleep(1) return ['<h2>Hello!</h2></html>']
        return self.assertResponse( (WSGI(application), 'http://host/'), (200, {"Content-Length": None}, '<html><h1>A little bit of HTML</h1><h2>Hello!</h2></html>
')) def test_responseCode(self): """ Test that WSGIResource handles strange response codes properly. """ def application(environ, start_response): status = '314' response_headers = [('Content-type','text/html')] writer = start_response(status, response_headers) return [] return self.assertResponse( (WSGI(application), 'http://host/'), (314, {"Content-Length": 0}, '')) def test_errorfulResource(self): def application(environ, start_response): raise TestError("This is an expected error") return self.assertResponse( (WSGI(application), 'http://host/'), (500, {}, None)).addBoth(self.flushErrors, TestError) def test_errorfulResource2(self): def application(environ, start_response): write = start_response("200 OK", {}) write("Foo") raise TestError("This is an expected error") return self.assertResponse( (WSGI(application), 'http://host/'), (200, {"Content-Length": None}, "Foo"), failure=True ).addBoth(self.flushErrors, TestError) def test_errorfulIterator(self): def iterator(): raise TestError("This is an expected error") def application(environ, start_response): start_response("200 OK", {}) return iterator() return self.assertResponse( (WSGI(application), 'http://host/'), (500, {}, None)).addBoth(self.flushErrors, TestError) def test_errorfulIterator2(self): def iterator(): yield "Foo" yield "Bar" raise TestError("This is also expected") def application(environ, start_response): start_response("200 OK", {}) return iterator() return self.assertResponse( (WSGI(application), 'http://host/'), (200, {"Content-Length": None}, "FooBar"), failure=True ).addBoth(self.flushErrors, TestError) def test_didntCallStartResponse(self): def application(environ, start_response): return ["Foo"] return self.assertResponse( (WSGI(application), 'http://host/'), (500, {}, None)).addBoth(self.flushErrors, RuntimeError) def test_calledStartResponseLate(self): def application(environ, start_response): start_response("200 OK", {}) yield "Foo" return self.assertResponse( (WSGI(application), 'http://host/'), (200, {"Content-Length": None}, "Foo")) def test_returnList(self): def application(environ, start_response): write = start_response("200 OK", {}) return ["Foo", "Bar"] return self.assertResponse( (WSGI(application), 'http://host/'), (200, {"Content-Length": 6}, "FooBar")) def test_readAllInput(self): def application(environ, start_response): input = environ['wsgi.input'] out = input.read(-1) start_response("200 OK", {}) return [out] return self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "This is some content"), (200, {"Content-Length": 20}, "This is some content")) def test_readInputLines(self): def application(environ, start_response): input = environ['wsgi.input'] out = 'X'.join(input.readlines()) start_response("200 OK", {}) return [out] d = self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "a\nb\nc"), (200, {"Content-Length": 7}, "a\nXb\nXc")) d.addCallback(lambda d: self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "a\nb\n"), (200, {"Content-Length": 5}, "a\nXb\n"))) return d def test_readInputLineSizeNegZero(self): """ Test that calling wsgi.input.readline works with -1 and 0 and none. 
""" def application(environ, start_response): input = environ['wsgi.input'] out = [input.read(5)] # 'Line ' out.extend(["X", input.readline(-1)]) # 'blah blah\n' out.extend(["X", input.readline(0)]) # '' out.extend(["X", input.readline(None)]) # 'Oh Line\n' out.extend(["X", input.readline()]) # '' start_response("200 OK", {}) return out return self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "Line blah blah\nOh Line\n"), (200, {"Content-Length": 27}, "Line Xblah blah\nXXOh Line\nX")) def test_readInputLineSize(self): """ Test that readline() with a size works. """ def application(environ, start_response): input = environ['wsgi.input'] out = [input.read(5)] # 'Line ' out.extend(["X", input.readline(5)]) # 'blah ' out.extend(["X", input.readline()]) # 'blah\n' out.extend(["X", input.readline(1)]) # 'O' out.extend(["X", input.readline()]) # 'h Line\n' start_response("200 OK", {}) return out return self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "Line blah blah\nOh Line\n"), (200, {"Content-Length": 27}, "Line Xblah Xblah\nXOXh Line\n")) def test_readInputMixed(self): def application(environ, start_response): input = environ['wsgi.input'] out = [input.read(5)] out.extend(["X", input.readline()]) out.extend(["X", input.read(1)]) out.extend(["X", input.readline()]) start_response("200 OK", {}) return out return self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "Line blah blah\nOh Line\n"), (200, {"Content-Length": 26}, "Line Xblah blah\nXOXh Line\n")) def test_readiter(self): """ Test that using wsgi.input as an iterator works. """ def application(environ, start_response): input = environ['wsgi.input'] out = 'X'.join(input) start_response("200 OK", {}) return [out] return self.assertResponse( (WSGI(application), 'http://host/', {}, None, None, '', "Line blah blah\nOh Line\n"), (200, {"Content-Length": 24}, "Line blah blah\nXOh Line\n")) class TestWSGIEnvironment(BaseCase): """ Test that the WSGI container does everything we expect it to do with the WSGI environment dictionary. """ def envApp(self, *varnames): """ Return a WSGI application that writes environment variables. """ def _app(environ, start_response): status = '200' response_headers = [('Content-type','text/html')] writer = start_response(status, response_headers) return ['%s=%r;' % (k, environ.get(k, '')) for k in varnames] return _app def assertEnv(self, uri, env, version=None, prepath=''): """ Check the value of the rendering envirnment against the string returned by the testing WSGIApp. """ keys = env.keys() keys.sort() envstring = ''.join(['%s=%r;' % (k, v) for k, v in env.items()]) return self.assertResponse( (WSGI(self.envApp(*keys)), uri, None, None, version, prepath), (200, {}, envstring)) def test_wsgi_url_scheme(self): """ Check the value of C{wsgi.url_scheme} variable for the http and the https cases. """ return defer.gatherResults([ self.assertEnv('https://host/', {'wsgi.url_scheme': 'https'}), self.assertEnv('http://host/', {'wsgi.url_scheme': 'http'}) ]) def test_SERVER_PROTOCOL(self): """ Check the value the C{SERVER_PROTOCOL} variable in the WSGI environment. """ return self.assertEnv('http://host/', {'SERVER_PROTOCOL': 'HTTP/1.1'}) def test_SERVER_PORT(self): """ Check the value of the C{SERVER_PORT} variable in the WSGI environment, for different kind of URLs. 
""" return defer.gatherResults([ self.assertEnv('http://host/', {'SERVER_PORT': '80'}), self.assertEnv('http://host:523/', {'SERVER_PORT': '523'}), self.assertEnv('https://host/', {'SERVER_PORT': '443'}), self.assertEnv('https://host:523/', {'SERVER_PORT': '523'}), self.assertEnv('/foo', {'SERVER_PORT': '80'}, version=(1,0)) ]) def test_SCRIPT_NAME(self): """ Check the value of C{SCRIPT_NAME}, depending of the prepath field. """ return defer.gatherResults([ self.assertEnv('http://host/', {'SCRIPT_NAME': ''}), self.assertEnv('http://host/myscript/foobar', {'SCRIPT_NAME': '/myscript', 'PATH_INFO': '/foobar'}, prepath='/myscript'), self.assertEnv('http://host/myscript/foobar/', {'SCRIPT_NAME': '/myscript/foobar', 'PATH_INFO': '/'}, prepath='/myscript/foobar'), self.assertEnv('http://host/myscript/foobar?bar=baz', {'SCRIPT_NAME': '/myscript', 'PATH_INFO': '/foobar'}, prepath='/myscript') ]) if WSGI is None: for cls in (TestContainer, TestWSGIEnvironment): setattr(cls, 'skip', 'Required thread support is missing, skipping') TwistedWeb2-8.1.0/twisted/web2/test/test_fileupload.py0000644000175000017500000002100610713026424021371 0ustar dokodoko# Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ Tests for L{twisted.web2.fileupload} and its different parsing functions. """ from twisted.internet import defer from twisted.trial import unittest from twisted.internet.defer import waitForDeferred, deferredGenerator from twisted.web2 import stream, fileupload from twisted.web2.http_headers import MimeType class TestStream(stream.SimpleStream): """ A stream that reads less data at a time than it could. """ def __init__(self, mem, maxReturn=1000, start=0, length=None): self.mem = mem self.start = start self.maxReturn = maxReturn if length is None: self.length = len(mem) - start else: if len(mem) < length: raise ValueError("len(mem) < start + length") self.length = length def read(self): if self.mem is None: return None if self.length == 0: result = None else: amtToRead = min(self.maxReturn, self.length) result = buffer(self.mem, self.start, amtToRead) self.length -= amtToRead self.start += amtToRead return result def close(self): self.mem = None stream.SimpleStream.close(self) class MultipartTests(unittest.TestCase): def doTestError(self, boundary, data, expected_error): # Test different amounts of data at a time. ds = [fileupload.parseMultipartFormData(TestStream(data, maxReturn=bytes), boundary) for bytes in range(1, 20)] d = defer.DeferredList(ds, consumeErrors=True) d.addCallback(self._assertFailures, expected_error) return d def _assertFailures(self, failures, *expectedFailures): for flag, failure in failures: self.failUnlessEqual(flag, defer.FAILURE) failure.trap(*expectedFailures) def doTest(self, boundary, data, expected_args, expected_files): #import time, gc, cgi, cStringIO for bytes in range(1, 20): #s = TestStream(data, maxReturn=bytes) s = stream.IStream(data) #t=time.time() d = waitForDeferred(fileupload.parseMultipartFormData(s, boundary)) yield d; args, files = d.getResult() #e=time.time() #print "%.2g"%(e-t) self.assertEquals(args, expected_args) # Read file data back into memory to compare. 
out = {} for name, l in files.items(): out[name] = [(filename, ctype, f.read()) for (filename, ctype, f) in l] self.assertEquals(out, expected_files) #data=cStringIO.StringIO(data) #t=time.time() #d=cgi.parse_multipart(data, {'boundary':boundary}) #e=time.time() #print "CGI: %.2g"%(e-t) doTest = deferredGenerator(doTest) def testNormalUpload(self): return self.doTest( '---------------------------155781040421463194511908194298', """-----------------------------155781040421463194511908194298\r Content-Disposition: form-data; name="foo"\r \r Foo Bar\r -----------------------------155781040421463194511908194298\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/html\r \r Contents of a file blah blah\r -----------------------------155781040421463194511908194298--\r """, {'foo':['Foo Bar']}, {'file':[('filename', MimeType('text', 'html'), "Contents of a file\nblah\nblah")]}) def testMultipleUpload(self): return self.doTest( 'xyz', """--xyz\r Content-Disposition: form-data; name="foo"\r \r Foo Bar\r --xyz\r Content-Disposition: form-data; name="foo"\r \r Baz\r --xyz\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/html\r \r blah\r --xyz\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/plain\r \r bleh\r --xyz--\r """, {'foo':['Foo Bar', 'Baz']}, {'file':[('filename', MimeType('text', 'html'), "blah"), ('filename', MimeType('text', 'plain'), "bleh")]}) def testStupidFilename(self): return self.doTest( '----------0xKhTmLbOuNdArY', """------------0xKhTmLbOuNdArY\r Content-Disposition: form-data; name="file"; filename="foo"; name="foobar.txt"\r Content-Type: text/plain\r \r Contents of a file blah blah\r ------------0xKhTmLbOuNdArY--\r """, {}, {'file':[('foo"; name="foobar.txt', MimeType('text', 'plain'), "Contents of a file\nblah\nblah")]}) def testEmptyFilename(self): return self.doTest( 'curlPYafCMnsamUw9kSkJJkSen41sAV', """--curlPYafCMnsamUw9kSkJJkSen41sAV\r cONTENT-tYPE: application/octet-stream\r cONTENT-dISPOSITION: FORM-DATA; NAME="foo"; FILENAME=""\r \r qwertyuiop\r --curlPYafCMnsamUw9kSkJJkSen41sAV--\r """, {}, {'foo':[('', MimeType('application', 'octet-stream'), "qwertyuiop")]}) # Failing parses def testMissingContentDisposition(self): return self.doTestError( '----------0xKhTmLbOuNdArY', """------------0xKhTmLbOuNdArY\r Content-Type: text/html\r \r Blah blah I am a stupid webbrowser\r ------------0xKhTmLbOuNdArY--\r """, fileupload.MimeFormatError) def testRandomData(self): return self.doTestError( 'boundary', """--sdkjsadjlfjlj skjsfdkljsd sfdkjsfdlkjhsfadklj sffkj""", fileupload.MimeFormatError) def test_tooBigUpload(self): """ Test that a too big form post fails. """ boundary = '---------------------------155781040421463194511908194298' data = """-----------------------------155781040421463194511908194298\r Content-Disposition: form-data; name="foo"\r \r Foo Bar\r -----------------------------155781040421463194511908194298\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/html\r \r Contents of a file blah blah\r -----------------------------155781040421463194511908194298--\r """ s = stream.IStream(data) return self.assertFailure( fileupload.parseMultipartFormData(s, boundary, maxSize=200), fileupload.MimeFormatError) def test_tooManyFields(self): """ Test when breaking the maximum number of fields. 
""" boundary = 'xyz' data = """--xyz\r Content-Disposition: form-data; name="foo"\r \r Foo Bar\r --xyz\r Content-Disposition: form-data; name="foo"\r \r Baz\r --xyz\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/html\r \r blah\r --xyz\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/plain\r \r bleh\r --xyz--\r """ s = stream.IStream(data) return self.assertFailure( fileupload.parseMultipartFormData(s, boundary, maxFields=3), fileupload.MimeFormatError) def test_maxMem(self): """ An attachment with no filename goes to memory: check that the C{maxMem} parameter limits the size of this kind of attachment. """ boundary = '---------------------------155781040421463194511908194298' data = """-----------------------------155781040421463194511908194298\r Content-Disposition: form-data; name="foo"\r \r Foo Bar and more content\r -----------------------------155781040421463194511908194298\r Content-Disposition: form-data; name="file"; filename="filename"\r Content-Type: text/html\r \r Contents of a file blah blah\r -----------------------------155781040421463194511908194298--\r """ s = stream.IStream(data) return self.assertFailure( fileupload.parseMultipartFormData(s, boundary, maxMem=10), fileupload.MimeFormatError) class TestURLEncoded(unittest.TestCase): def doTest(self, data, expected_args): for bytes in range(1, 20): s = TestStream(data, maxReturn=bytes) d = waitForDeferred(fileupload.parse_urlencoded(s)) yield d; args = d.getResult() self.assertEquals(args, expected_args) doTest = deferredGenerator(doTest) def test_parseValid(self): self.doTest("a=b&c=d&c=e", {'a':['b'], 'c':['d', 'e']}) self.doTest("a=b&c=d&c=e", {'a':['b'], 'c':['d', 'e']}) self.doTest("a=b+c%20d", {'a':['b c d']}) def test_parseInvalid(self): self.doTest("a&b=c", {'b':['c']}) TwistedWeb2-8.1.0/twisted/web2/test/stream_data.txt0000644000175000017500000000002510220162067020654 0ustar dokodokoWe've got some text! TwistedWeb2-8.1.0/twisted/web2/test/test_client.py0000644000175000017500000003555610706063764020554 0ustar dokodoko# Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ Tests for HTTP client. 
""" from twisted.internet import protocol, defer from twisted.web2.client import http from twisted.web2 import http_headers from twisted.web2 import stream from twisted.web2.test.test_http import LoopbackRelay, HTTPTests, TestConnection class TestServer(protocol.Protocol): data = "" done = False def dataReceived(self, data): self.data += data def write(self, data): self.transport.write(data) def connectionLost(self, reason): self.done = True self.transport.loseConnection() def loseConnection(self): self.done = True self.transport.loseConnection() class ClientTests(HTTPTests): def connect(self, logFile=None, maxPipeline=4, inputTimeOut=60000, betweenRequestsTimeOut=600000): cxn = TestConnection() cxn.client = http.HTTPClientProtocol() cxn.client.inputTimeOut = inputTimeOut cxn.server = TestServer() cxn.serverToClient = LoopbackRelay(cxn.client, logFile) cxn.clientToServer = LoopbackRelay(cxn.server, logFile) cxn.server.makeConnection(cxn.serverToClient) cxn.client.makeConnection(cxn.clientToServer) return cxn def writeToClient(self, cxn, data): cxn.server.write(data) self.iterate(cxn) def writeLines(self, cxn, lines): self.writeToClient(cxn, '\r\n'.join(lines)) def assertReceived(self, cxn, expectedStatus, expectedHeaders, expectedContent=None): self.iterate(cxn) headers, content = cxn.server.data.split('\r\n\r\n', 1) status, headers = headers.split('\r\n', 1) headers = headers.split('\r\n') # check status line self.assertEquals(status, expectedStatus) # check headers (header order isn't guraunteed so we use # self.assertIn for x in headers: self.assertIn(x, expectedHeaders) if not expectedContent: expectedContent = '' self.assertEquals(content, expectedContent) def assertDone(self, cxn): self.iterate(cxn) self.assertEquals(cxn.server.done, True, 'Connection not closed.') def assertHeaders(self, resp, expectedHeaders): headers = list(resp.headers.getAllRawHeaders()) headers.sort() self.assertEquals(headers, expectedHeaders) def checkResponse(self, resp, code, headers, length, data): """ Assert various things about a response: http code, headers, stream length, and data in stream. """ def gotData(gotdata): self.assertEquals(gotdata, data) self.assertEquals(resp.code, code) self.assertHeaders(resp, headers) self.assertEquals(resp.stream.length, length) return defer.maybeDeferred(resp.stream.read).addCallback(gotData) class TestHTTPClient(ClientTests): """ Test that the http client works. """ def test_simpleRequest(self): """ Your basic simple HTTP Request. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) d = cxn.client.submitRequest(req).addCallback(self.checkResponse, 200, [], 10, '1234567890') self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: close']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Content-Length: 10', 'Connection: close', '', '1234567890')) return d.addCallback(lambda _: self.assertDone(cxn)) def test_delayedContent(self): """ Make sure that the client returns the response object as soon as the headers are received, even if the data hasn't arrived yet. 
""" cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) def gotData(data): self.assertEquals(data, '1234567890') def gotResp(resp): self.assertEquals(resp.code, 200) self.assertHeaders(resp, []) self.assertEquals(resp.stream.length, 10) self.writeToClient(cxn, '1234567890') return defer.maybeDeferred(resp.stream.read).addCallback(gotData) d = cxn.client.submitRequest(req).addCallback(gotResp) self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: close']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Content-Length: 10', 'Connection: close', '\r\n')) return d.addCallback(lambda _: self.assertDone(cxn)) def test_prematurePipelining(self): """ Ensure that submitting a second request before it's allowed results in an AssertionError. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) req2 = http.ClientRequest('GET', '/bar', None, None) d = cxn.client.submitRequest(req, closeAfter=False).addCallback( self.checkResponse, 200, [], 0, None) self.assertRaises(AssertionError, cxn.client.submitRequest, req2) self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: Keep-Alive']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Content-Length: 0', 'Connection: close', '\r\n')) return d def test_userHeaders(self): """ Make sure that headers get through in both directions. """ cxn = self.connect(inputTimeOut=None) def submitNext(_): headers = http_headers.Headers( headers={'Accept-Language': {'en': 1.0}}, rawHeaders={'X-My-Other-Header': ['socks']}) req = http.ClientRequest('GET', '/', headers, None) cxn.server.data = '' d = cxn.client.submitRequest(req, closeAfter=True) self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: close', 'X-My-Other-Header: socks', 'Accept-Language: en']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Content-Length: 0', 'Connection: close', '\r\n')) return d req = http.ClientRequest('GET', '/', {'Accept-Language': {'en': 1.0}}, None) d = cxn.client.submitRequest(req, closeAfter=False).addCallback( self.checkResponse, 200, [('X-Foobar', ['Yes'])], 0, None).addCallback( submitNext) self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: Keep-Alive', 'Accept-Language: en']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Content-Length: 0', 'X-Foobar: Yes', '\r\n')) return d.addCallback(lambda _: self.assertDone(cxn)) def test_streamedUpload(self): """ Make sure that sending request content works. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('PUT', '/foo', None, 'Helloooo content') d = cxn.client.submitRequest(req).addCallback(self.checkResponse, 202, [], 0, None) self.assertReceived(cxn, 'PUT /foo HTTP/1.1', ['Connection: close', 'Content-Length: 16'], 'Helloooo content') self.writeLines(cxn, ('HTTP/1.1 202 Accepted', 'Content-Length: 0', 'Connection: close', '\r\n')) return d.addCallback(lambda _: self.assertDone(cxn)) def test_sentHead(self): """ Ensure that HEAD requests work, and return Content-Length. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('HEAD', '/', None, None) d = cxn.client.submitRequest(req).addCallback(self.checkResponse, 200, [('Content-Length', ['5'])], 0, None) self.assertReceived(cxn, 'HEAD / HTTP/1.1', ['Connection: close']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Connection: close', 'Content-Length: 5', '', 'Pants')) # bad server return d.addCallback(lambda _: self.assertDone(cxn)) def test_sentHeadKeepAlive(self): """ Ensure that keepalive works right after a HEAD request. 
""" cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('HEAD', '/', None, None) didIt = [0] def gotData(data): self.assertEquals(data, None) def gotResp(resp): self.assertEquals(resp.code, 200) self.assertEquals(resp.stream.length, 0) self.assertHeaders(resp, []) return defer.maybeDeferred(resp.stream.read).addCallback(gotData) def submitRequest(second): if didIt[0]: return didIt[0] = second if second: keepAlive='close' else: keepAlive='Keep-Alive' cxn.server.data = '' d = cxn.client.submitRequest(req, closeAfter=second).addCallback( self.checkResponse, 200, [('Content-Length', ['5'])], 0, None) self.assertReceived(cxn, 'HEAD / HTTP/1.1', ['Connection: '+ keepAlive]) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Connection: '+ keepAlive, 'Content-Length: 5', '\r\n')) return d.addCallback(lambda _: submitRequest(1)) d = submitRequest(0) return d.addCallback(lambda _: self.assertDone(cxn)) def test_chunkedUpload(self): """ Ensure chunked data is correctly decoded on upload. """ cxn = self.connect(inputTimeOut=None) data = 'Foo bar baz bax' s = stream.ProducerStream(length=None) s.write(data) req = http.ClientRequest('PUT', '/', None, s) d = cxn.client.submitRequest(req) s.finish() self.assertReceived(cxn, 'PUT / HTTP/1.1', ['Connection: close', 'Transfer-Encoding: chunked'], '%X\r\n%s\r\n0\r\n\r\n' % (len(data), data)) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Connection: close', 'Content-Length: 0', '\r\n')) return d.addCallback(lambda _: self.assertDone(cxn)) class TestEdgeCases(ClientTests): def test_serverDoesntSendConnectionClose(self): """ Check that a lost connection is treated as end of response, if we requested connection: close, even if the server didn't respond with connection: close. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) d = cxn.client.submitRequest(req).addCallback(self.checkResponse, 200, [], None, 'Some Content') self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: close']) self.writeLines(cxn, ('HTTP/1.1 200 OK', '', 'Some Content')) return d.addCallback(lambda _: self.assertDone(cxn)) def test_serverIsntHttp(self): """ Check that an error is returned if the server doesn't talk HTTP. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) def gotResp(r): print r d = cxn.client.submitRequest(req).addCallback(gotResp) self.assertFailure(d, http.ProtocolError) self.writeLines(cxn, ('HTTP-NG/1.1 200 OK', '\r\n')) def test_newServer(self): """ Check that an error is returned if the server is a new major version. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) d = cxn.client.submitRequest(req) self.assertFailure(d, http.ProtocolError) self.writeLines(cxn, ('HTTP/2.3 200 OK', '\r\n')) def test_shortStatus(self): """ Check that an error is returned if the response line is invalid. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) d = cxn.client.submitRequest(req) self.assertFailure(d, http.ProtocolError) self.writeLines(cxn, ('HTTP/1.1 200', '\r\n')) def test_errorReadingRequestStream(self): """ Ensure that stream errors are propagated to the response. """ cxn = self.connect(inputTimeOut=None) s = stream.ProducerStream() s.write('Foo') req = http.ClientRequest('GET', '/', None, s) d = cxn.client.submitRequest(req) s.finish(IOError('Test Error')) return self.assertFailure(d, IOError) def test_connectionLost(self): """ Check that closing the connection is propagated to the response deferred. 
""" cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) d = cxn.client.submitRequest(req) self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: close']) cxn.client.connectionLost(ValueError("foo")) return self.assertFailure(d, ValueError) def test_connectionLostAfterHeaders(self): """ Test that closing the connection after headers are sent is propagated to the response stream. """ cxn = self.connect(inputTimeOut=None) req = http.ClientRequest('GET', '/', None, None) d = cxn.client.submitRequest(req) self.assertReceived(cxn, 'GET / HTTP/1.1', ['Connection: close']) self.writeLines(cxn, ('HTTP/1.1 200 OK', 'Content-Length: 10', 'Connection: close', '\r\n')) cxn.client.connectionLost(ValueError("foo")) def cb(response): return self.assertFailure(response.stream.read(), ValueError) d.addCallback(cb) return d TwistedWeb2-8.1.0/twisted/web2/test/test_http.py0000644000175000017500000012143710757520064020244 0ustar dokodoko from __future__ import nested_scopes import time, sys from zope.interface import implements from twisted.trial import unittest from twisted.web2 import http, http_headers, responsecode, error, iweb, stream from twisted.web2 import channel from twisted.internet import reactor, protocol, address, interfaces, utils from twisted.internet import defer from twisted.internet.defer import waitForDeferred, deferredGenerator from twisted.protocols import loopback from twisted.python import util, runtime from twisted.internet.task import deferLater class PreconditionTestCase(unittest.TestCase): def checkPreconditions(self, request, response, expectedResult, expectedCode, **kw): preconditionsPass = True try: http.checkPreconditions(request, response, **kw) except http.HTTPError, e: preconditionsPass = False self.assertEquals(e.response.code, expectedCode) self.assertEquals(preconditionsPass, expectedResult) def testWithoutHeaders(self): request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) out_headers = http_headers.Headers() response = http.Response(responsecode.OK, out_headers, None) self.checkPreconditions(request, response, True, responsecode.OK) out_headers.setHeader("ETag", http_headers.ETag('foo')) self.checkPreconditions(request, response, True, responsecode.OK) out_headers.removeHeader("ETag") out_headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT self.checkPreconditions(request, response, True, responsecode.OK) out_headers.setHeader("ETag", http_headers.ETag('foo')) self.checkPreconditions(request, response, True, responsecode.OK) def testIfMatch(self): request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) out_headers = http_headers.Headers() response = http.Response(responsecode.OK, out_headers, None) # Behavior with no ETag set, should be same as with an ETag request.headers.setRawHeaders("If-Match", ('*',)) self.checkPreconditions(request, response, True, responsecode.OK) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED, entityExists=False) # Ask for tag, but no etag set. 
request.headers.setRawHeaders("If-Match", ('"frob"',)) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) ## Actually set the ETag header out_headers.setHeader("ETag", http_headers.ETag('foo')) out_headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT # behavior of entityExists request.headers.setRawHeaders("If-Match", ('*',)) self.checkPreconditions(request, response, True, responsecode.OK) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED, entityExists=False) # tag matches request.headers.setRawHeaders("If-Match", ('"frob", "foo"',)) self.checkPreconditions(request, response, True, responsecode.OK) # none match request.headers.setRawHeaders("If-Match", ('"baz", "bob"',)) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) # But if we have an error code already, ignore this header response.code = responsecode.INTERNAL_SERVER_ERROR self.checkPreconditions(request, response, True, responsecode.INTERNAL_SERVER_ERROR) response.code = responsecode.OK # Must only compare strong tags out_headers.setHeader("ETag", http_headers.ETag('foo', weak=True)) request.headers.setRawHeaders("If-Match", ('W/"foo"',)) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) def testIfUnmodifiedSince(self): request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) out_headers = http_headers.Headers() response = http.Response(responsecode.OK, out_headers, None) # No Last-Modified => always fail. request.headers.setRawHeaders("If-Unmodified-Since", ('Mon, 03 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) # Set output headers out_headers.setHeader("ETag", http_headers.ETag('foo')) out_headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT request.headers.setRawHeaders("If-Unmodified-Since", ('Mon, 03 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, True, responsecode.OK) request.headers.setRawHeaders("If-Unmodified-Since", ('Sat, 01 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) # But if we have an error code already, ignore this header response.code = responsecode.INTERNAL_SERVER_ERROR self.checkPreconditions(request, response, True, responsecode.INTERNAL_SERVER_ERROR) response.code = responsecode.OK # invalid date => header ignored request.headers.setRawHeaders("If-Unmodified-Since", ('alalalalalalalalalala',)) self.checkPreconditions(request, response, True, responsecode.OK) def testIfModifiedSince(self): if time.time() < 946771200: self.fail(RuntimeError("Your computer's clock is way wrong, " "this test will be invalid.")) request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) out_headers = http_headers.Headers() response = http.Response(responsecode.OK, out_headers, None) # No Last-Modified => always succeed request.headers.setRawHeaders("If-Modified-Since", ('Mon, 03 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, True, responsecode.OK) # Set output headers out_headers.setHeader("ETag", http_headers.ETag('foo')) out_headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT request.headers.setRawHeaders("If-Modified-Since", ('Mon, 03 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, False, responsecode.NOT_MODIFIED) # With a non-GET method request.method="PUT" 
self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) request.method="GET" request.headers.setRawHeaders("If-Modified-Since", ('Sat, 01 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, True, responsecode.OK) # But if we have an error code already, ignore this header response.code = responsecode.INTERNAL_SERVER_ERROR self.checkPreconditions(request, response, True, responsecode.INTERNAL_SERVER_ERROR) response.code = responsecode.OK # invalid date => header ignored request.headers.setRawHeaders("If-Modified-Since", ('alalalalalalalalalala',)) self.checkPreconditions(request, response, True, responsecode.OK) # date in the future => assume modified request.headers.setHeader("If-Modified-Since", time.time() + 500) self.checkPreconditions(request, response, True, responsecode.OK) def testIfNoneMatch(self): request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) out_headers = http_headers.Headers() response = http.Response(responsecode.OK, out_headers, None) request.headers.setRawHeaders("If-None-Match", ('"foo"',)) self.checkPreconditions(request, response, True, responsecode.OK) out_headers.setHeader("ETag", http_headers.ETag('foo')) out_headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT # behavior of entityExists request.headers.setRawHeaders("If-None-Match", ('*',)) request.method="PUT" self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) request.method="GET" self.checkPreconditions(request, response, False, responsecode.NOT_MODIFIED) self.checkPreconditions(request, response, True, responsecode.OK, entityExists=False) # tag matches request.headers.setRawHeaders("If-None-Match", ('"frob", "foo"',)) request.method="PUT" self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) request.method="GET" self.checkPreconditions(request, response, False, responsecode.NOT_MODIFIED) # now with IMS, also: request.headers.setRawHeaders("If-Modified-Since", ('Mon, 03 Jan 2000 00:00:00 GMT',)) request.method="PUT" self.checkPreconditions(request, response, False, responsecode.PRECONDITION_FAILED) request.method="GET" self.checkPreconditions(request, response, False, responsecode.NOT_MODIFIED) request.headers.setRawHeaders("If-Modified-Since", ('Sat, 01 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, True, responsecode.OK) request.headers.removeHeader("If-Modified-Since") # none match request.headers.setRawHeaders("If-None-Match", ('"baz", "bob"',)) self.checkPreconditions(request, response, True, responsecode.OK) # now with IMS, also: request.headers.setRawHeaders("If-Modified-Since", ('Mon, 03 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, True, responsecode.OK) request.headers.setRawHeaders("If-Modified-Since", ('Sat, 01 Jan 2000 00:00:00 GMT',)) self.checkPreconditions(request, response, True, responsecode.OK) request.headers.removeHeader("If-Modified-Since") # But if we have an error code already, ignore this header response.code = responsecode.INTERNAL_SERVER_ERROR self.checkPreconditions(request, response, True, responsecode.INTERNAL_SERVER_ERROR) response.code = responsecode.OK # Weak tags okay for GET out_headers.setHeader("ETag", http_headers.ETag('foo', weak=True)) request.headers.setRawHeaders("If-None-Match", ('W/"foo"',)) self.checkPreconditions(request, response, False, responsecode.NOT_MODIFIED) # Weak tags not okay for other methods request.method="PUT" out_headers.setHeader("ETag", 
http_headers.ETag('foo', weak=True)) request.headers.setRawHeaders("If-None-Match", ('W/"foo"',)) self.checkPreconditions(request, response, True, responsecode.OK) def testNoResponse(self): # Ensure that passing etag/lastModified arguments instead of response works. request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) request.method="PUT" request.headers.setRawHeaders("If-None-Match", ('"foo"',)) self.checkPreconditions(request, None, True, responsecode.OK) self.checkPreconditions(request, None, False, responsecode.PRECONDITION_FAILED, etag=http_headers.ETag('foo'), lastModified=946771200) # Make sure that, while you shoudn't do this, that it doesn't cause an error request.method="GET" self.checkPreconditions(request, None, False, responsecode.NOT_MODIFIED, etag=http_headers.ETag('foo')) class IfRangeTestCase(unittest.TestCase): def testIfRange(self): request = http.Request(None, "GET", "/", "HTTP/1.1", 0, http_headers.Headers()) response = TestResponse() self.assertEquals(http.checkIfRange(request, response), True) request.headers.setRawHeaders("If-Range", ('"foo"',)) self.assertEquals(http.checkIfRange(request, response), False) response.headers.setHeader("ETag", http_headers.ETag('foo')) self.assertEquals(http.checkIfRange(request, response), True) request.headers.setRawHeaders("If-Range", ('"bar"',)) response.headers.setHeader("ETag", http_headers.ETag('foo')) self.assertEquals(http.checkIfRange(request, response), False) request.headers.setRawHeaders("If-Range", ('W/"foo"',)) response.headers.setHeader("ETag", http_headers.ETag('foo', weak=True)) self.assertEquals(http.checkIfRange(request, response), False) request.headers.setRawHeaders("If-Range", ('"foo"',)) response.headers.removeHeader("ETag") self.assertEquals(http.checkIfRange(request, response), False) request.headers.setRawHeaders("If-Range", ('Sun, 02 Jan 2000 00:00:00 GMT',)) response.headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT self.assertEquals(http.checkIfRange(request, response), True) request.headers.setRawHeaders("If-Range", ('Sun, 02 Jan 2000 00:00:01 GMT',)) response.headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT self.assertEquals(http.checkIfRange(request, response), False) request.headers.setRawHeaders("If-Range", ('Sun, 01 Jan 2000 23:59:59 GMT',)) response.headers.setHeader("Last-Modified", 946771200) # Sun, 02 Jan 2000 00:00:00 GMT self.assertEquals(http.checkIfRange(request, response), False) request.headers.setRawHeaders("If-Range", ('Sun, 01 Jan 2000 23:59:59 GMT',)) response.headers.removeHeader("Last-Modified") self.assertEquals(http.checkIfRange(request, response), False) request.headers.setRawHeaders("If-Range", ('jwerlqjL#$Y*KJAN',)) self.assertEquals(http.checkIfRange(request, response), False) class LoopbackRelay(loopback.LoopbackRelay): implements(interfaces.IProducer) def pauseProducing(self): self.paused = True def resumeProducing(self): self.paused = False def stopProducing(self): self.loseConnection() def loseWriteConnection(self): # HACK. 
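        # Editor's note: the loopback relay used by these tests has no real
        # half-close support, so shutting down only the write side is
        # approximated here with a full loseConnection().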
self.loseConnection() class TestRequest(http.Request): def __init__(self, *args, **kwargs): http.Request.__init__(self, *args, **kwargs) self.cmds = [] headers = list(self.headers.getAllRawHeaders()) headers.sort() self.cmds.append(('init', self.method, self.uri, self.clientproto, self.stream.length, tuple(headers))) def process(self): pass def handleContentChunk(self, data): self.cmds.append(('contentChunk', data)) def handleContentComplete(self): self.cmds.append(('contentComplete',)) def connectionLost(self, reason): self.cmds.append(('connectionLost', reason)) class TestResponse(object): implements(iweb.IResponse) code = responsecode.OK headers = None def __init__(self): self.headers = http_headers.Headers() self.stream = stream.ProducerStream() def write(self, data): self.stream.write(data) def finish(self): self.stream.finish() class TestClient(protocol.Protocol): data = "" done = False def dataReceived(self, data): self.data+=data def write(self, data): self.transport.write(data) def connectionLost(self, reason): self.done = True self.transport.loseConnection() def loseConnection(self): self.done = True self.transport.loseConnection() class TestConnection: def __init__(self): self.requests = [] self.client = None self.callLaters = [] def fakeCallLater(self, secs, f): assert secs == 0 self.callLaters.append(f) class HTTPTests(unittest.TestCase): def connect(self, logFile=None, **protocol_kwargs): cxn = TestConnection() def makeTestRequest(*args): cxn.requests.append(TestRequest(*args)) return cxn.requests[-1] factory = channel.HTTPFactory(requestFactory=makeTestRequest, _callLater=cxn.fakeCallLater, **protocol_kwargs) cxn.client = TestClient() cxn.server = factory.buildProtocol(address.IPv4Address('TCP', '127.0.0.1', 2345)) cxn.serverToClient = LoopbackRelay(cxn.client, logFile) cxn.clientToServer = LoopbackRelay(cxn.server, logFile) cxn.server.makeConnection(cxn.serverToClient) cxn.client.makeConnection(cxn.clientToServer) return cxn def iterate(self, cxn): callLaters = cxn.callLaters cxn.callLaters = [] for f in callLaters: f() cxn.serverToClient.clearBuffer() cxn.clientToServer.clearBuffer() if cxn.serverToClient.shouldLose: cxn.serverToClient.clearBuffer() if cxn.clientToServer.shouldLose: cxn.clientToServer.clearBuffer() def compareResult(self, cxn, cmds, data): self.iterate(cxn) for receivedRequest, expectedCommands in map(None, cxn.requests, cmds): sortedHeaderCommands = [] for cmd in expectedCommands: if len(cmd) == 6: sortedHeaders = list(cmd[5]) sortedHeaders.sort() sortedHeaderCommands.append(cmd[:5] + (tuple(sortedHeaders),)) else: sortedHeaderCommands.append(cmd) self.assertEquals(receivedRequest.cmds, sortedHeaderCommands) self.assertEquals(cxn.client.data, data) def assertDone(self, cxn, done=True): self.iterate(cxn) self.assertEquals(cxn.client.done, done) class CoreHTTPTestCase(HTTPTests): # Note: these tests compare the client output using string # matching. It is acceptable for this to change and break # the test if you know what you are doing. 
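    # Editor's sketch (not part of the original tests) of the pattern the
    # following test methods use:
    #
    #     cxn = self.connect()                       # loopback client/server pair
    #     cxn.client.write("GET / HTTP/1.1\r\n...")  # raw bytes on the wire
    #     self.compareResult(cxn, cmds, data)        # parsed commands + client data so far
    #     response = TestResponse()
    #     cxn.requests[0].writeResponse(response)    # hand a response to the channel
    #     response.write("...")
    #     response.finish()
    #     self.assertDone(cxn)                       # connection torn down as expected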
def testHTTP0_9(self, nouri=False): cxn = self.connect() cmds = [[]] data = "" if nouri: cxn.client.write("GET\r\n") else: cxn.client.write("GET /\r\n") # Second request which should not be handled cxn.client.write("GET /two\r\n") cmds[0] += [('init', 'GET', '/', (0,9), 0, ()), ('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.headers.setRawHeaders("Yo", ("One", "Two")) cxn.requests[0].writeResponse(response) response.write("") self.compareResult(cxn, cmds, data) response.write("Output") data += "Output" self.compareResult(cxn, cmds, data) response.finish() self.compareResult(cxn, cmds, data) self.assertDone(cxn) def testHTTP0_9_nouri(self): self.testHTTP0_9(True) def testHTTP1_0(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.0\r\nContent-Length: 5\r\nHost: localhost\r\n\r\nInput") # Second request which should not be handled cxn.client.write("GET /two HTTP/1.0\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,0), 5, (('Host', ['localhost']),)), ('contentChunk', 'Input'), ('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.headers.setRawHeaders("Yo", ("One", "Two")) cxn.requests[0].writeResponse(response) response.write("") data += "HTTP/1.1 200 OK\r\nYo: One\r\nYo: Two\r\nConnection: close\r\n\r\n" self.compareResult(cxn, cmds, data) response.write("Output") data += "Output" self.compareResult(cxn, cmds, data) response.finish() self.compareResult(cxn, cmds, data) self.assertDone(cxn) def testHTTP1_0_keepalive(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.0\r\nConnection: keep-alive\r\nContent-Length: 5\r\nHost: localhost\r\n\r\nInput") cxn.client.write("GET /two HTTP/1.0\r\n\r\n") # Third request shouldn't be handled cxn.client.write("GET /three HTTP/1.0\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,0), 5, (('Host', ['localhost']),)), ('contentChunk', 'Input'), ('contentComplete',)] self.compareResult(cxn, cmds, data) response0 = TestResponse() response0.headers.setRawHeaders("Content-Length", ("6", )) response0.headers.setRawHeaders("Yo", ("One", "Two")) cxn.requests[0].writeResponse(response0) response0.write("") data += "HTTP/1.1 200 OK\r\nContent-Length: 6\r\nYo: One\r\nYo: Two\r\nConnection: Keep-Alive\r\n\r\n" self.compareResult(cxn, cmds, data) response0.write("Output") data += "Output" self.compareResult(cxn, cmds, data) response0.finish() # Now for second request: cmds.append([]) cmds[1] += [('init', 'GET', '/two', (1,0), 0, ()), ('contentComplete',)] self.compareResult(cxn, cmds, data) response1 = TestResponse() response1.headers.setRawHeaders("Content-Length", ("0", )) cxn.requests[1].writeResponse(response1) response1.write("") data += "HTTP/1.1 200 OK\r\nContent-Length: 0\r\nConnection: close\r\n\r\n" self.compareResult(cxn, cmds, data) response1.finish() self.assertDone(cxn) def testHTTP1_1_pipelining(self): cxn = self.connect(maxPipeline=2) cmds = [] data = "" # Both these show up immediately. cxn.client.write("GET / HTTP/1.1\r\nContent-Length: 5\r\nHost: localhost\r\n\r\nInput") cxn.client.write("GET /two HTTP/1.1\r\nHost: localhost\r\n\r\n") # Doesn't show up until the first is done. cxn.client.write("GET /three HTTP/1.1\r\nHost: localhost\r\n\r\n") # Doesn't show up until the second is done. 
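        # Editor's note: connect() above was called with maxPipeline=2, so only
        # two requests are dispatched at a time; /three and /four stay buffered
        # in the channel until earlier responses complete.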
cxn.client.write("GET /four HTTP/1.1\r\nHost: localhost\r\n\r\n") cmds.append([]) cmds[0] += [('init', 'GET', '/', (1,1), 5, (('Host', ['localhost']),)), ('contentChunk', 'Input'), ('contentComplete',)] cmds.append([]) cmds[1] += [('init', 'GET', '/two', (1,1), 0, (('Host', ['localhost']),)), ('contentComplete',)] self.compareResult(cxn, cmds, data) response0 = TestResponse() response0.headers.setRawHeaders("Content-Length", ("6", )) cxn.requests[0].writeResponse(response0) response0.write("") data += "HTTP/1.1 200 OK\r\nContent-Length: 6\r\n\r\n" self.compareResult(cxn, cmds, data) response0.write("Output") data += "Output" self.compareResult(cxn, cmds, data) response0.finish() # Now the third request gets read: cmds.append([]) cmds[2] += [('init', 'GET', '/three', (1,1), 0, (('Host', ['localhost']),)), ('contentComplete',)] self.compareResult(cxn, cmds, data) # Let's write out the third request before the second. # This should not cause anything to be written to the client. response2 = TestResponse() response2.headers.setRawHeaders("Content-Length", ("5", )) cxn.requests[2].writeResponse(response2) response2.write("Three") response2.finish() self.compareResult(cxn, cmds, data) response1 = TestResponse() response1.headers.setRawHeaders("Content-Length", ("3", )) cxn.requests[1].writeResponse(response1) response1.write("Two") data += "HTTP/1.1 200 OK\r\nContent-Length: 3\r\n\r\nTwo" self.compareResult(cxn, cmds, data) response1.finish() # Fourth request shows up cmds.append([]) cmds[3] += [('init', 'GET', '/four', (1,1), 0, (('Host', ['localhost']),)), ('contentComplete',)] data += "HTTP/1.1 200 OK\r\nContent-Length: 5\r\n\r\nThree" self.compareResult(cxn, cmds, data) response3 = TestResponse() response3.headers.setRawHeaders("Content-Length", ("0",)) cxn.requests[3].writeResponse(response3) response3.finish() data += "HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n" self.compareResult(cxn, cmds, data) self.assertDone(cxn, done=False) cxn.client.loseConnection() self.assertDone(cxn) def testHTTP1_1_chunking(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\nTransfer-Encoding: chunked\r\nHost: localhost\r\n\r\n5\r\nInput\r\n") cmds[0] += [('init', 'GET', '/', (1,1), None, (('Host', ['localhost']),)), ('contentChunk', 'Input')] self.compareResult(cxn, cmds, data) cxn.client.write("1; blahblahblah\r\na\r\n10\r\nabcdefghijklmnop\r\n") cmds[0] += [('contentChunk', 'a'),('contentChunk', 'abcdefghijklmnop')] self.compareResult(cxn, cmds, data) cxn.client.write("0\r\nRandom-Ignored-Trailer: foo\r\n\r\n") cmds[0] += [('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() cxn.requests[0].writeResponse(response) response.write("Output") data += "HTTP/1.1 200 OK\r\nTransfer-Encoding: chunked\r\n\r\n6\r\nOutput\r\n" self.compareResult(cxn, cmds, data) response.write("blahblahblah") data += "C\r\nblahblahblah\r\n" self.compareResult(cxn, cmds, data) response.finish() data += "0\r\n\r\n" self.compareResult(cxn, cmds, data) cxn.client.loseConnection() self.assertDone(cxn) def testHTTP1_1_expect_continue(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\nContent-Length: 5\r\nHost: localhost\r\nExpect: 100-continue\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,1), 5, (('Expect', ['100-continue']), ('Host', ['localhost'])))] self.compareResult(cxn, cmds, data) cxn.requests[0].stream.read() data += "HTTP/1.1 100 Continue\r\n\r\n" self.compareResult(cxn, cmds, data) cxn.client.write("Input") cmds[0] += 
[('contentChunk', 'Input'), ('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.headers.setRawHeaders("Content-Length", ("6",)) cxn.requests[0].writeResponse(response) response.write("Output") response.finish() data += "HTTP/1.1 200 OK\r\nContent-Length: 6\r\n\r\nOutput" self.compareResult(cxn, cmds, data) cxn.client.loseConnection() self.assertDone(cxn) def testHTTP1_1_expect_continue_early_reply(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\nContent-Length: 5\r\nHost: localhost\r\nExpect: 100-continue\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,1), 5, (('Host', ['localhost']), ('Expect', ['100-continue'])))] self.compareResult(cxn, cmds, data) response = TestResponse() response.headers.setRawHeaders("Content-Length", ("6",)) cxn.requests[0].writeResponse(response) response.write("Output") response.finish() cmds[0] += [('contentComplete',)] data += "HTTP/1.1 200 OK\r\nContent-Length: 6\r\nConnection: close\r\n\r\nOutput" self.compareResult(cxn, cmds, data) cxn.client.loseConnection() self.assertDone(cxn) def testHeaderContinuation(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\nHost: localhost\r\nFoo: yada\r\n yada\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,1), 0, (('Host', ['localhost']), ('Foo', ['yada yada']),)), ('contentComplete',)] self.compareResult(cxn, cmds, data) cxn.client.loseConnection() self.assertDone(cxn) def testTimeout_immediate(self): # timeout 0 => timeout on first iterate call cxn = self.connect(inputTimeOut = 0) return deferLater(reactor, 0, self.assertDone, cxn) def testTimeout_inRequest(self): cxn = self.connect(inputTimeOut = 0.3) cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\n") return deferLater(reactor, 0.5, self.assertDone, cxn) def testTimeout_betweenRequests(self): cxn = self.connect(betweenRequestsTimeOut = 0.3) cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,1), 0, ()), ('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.headers.setRawHeaders("Content-Length", ("0",)) cxn.requests[0].writeResponse(response) response.finish() data += "HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n" self.compareResult(cxn, cmds, data) return deferLater(reactor, 0.5, self.assertDone, cxn) # Wait for timeout def testConnectionCloseRequested(self): cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,1), 0, ()), ('contentComplete',)] self.compareResult(cxn, cmds, data) cxn.client.write("GET / HTTP/1.1\r\nConnection: close\r\n\r\n") cmds.append([]) cmds[1] += [('init', 'GET', '/', (1,1), 0, ()), ('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.headers.setRawHeaders("Content-Length", ("0",)) cxn.requests[0].writeResponse(response) response.finish() data += "HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n" response = TestResponse() response.headers.setRawHeaders("Content-Length", ("0",)) cxn.requests[1].writeResponse(response) response.finish() data += "HTTP/1.1 200 OK\r\nContent-Length: 0\r\nConnection: close\r\n\r\n" self.compareResult(cxn, cmds, data) self.assertDone(cxn) def testExtraCRLFs(self): cxn = self.connect() cmds = [[]] data = "" # Some broken clients (old IEs) send an extra CRLF after post cxn.client.write("POST / HTTP/1.1\r\nContent-Length: 5\r\nHost: localhost\r\n\r\nInput\r\n") cmds[0] += [('init', 'POST', '/', (1,1), 5, (('Host', 
['localhost']),)), ('contentChunk', 'Input'), ('contentComplete',)] self.compareResult(cxn, cmds, data) cxn.client.write("GET /two HTTP/1.1\r\n\r\n") cmds.append([]) cmds[1] += [('init', 'GET', '/two', (1,1), 0, ()), ('contentComplete',)] self.compareResult(cxn, cmds, data) cxn.client.loseConnection() self.assertDone(cxn) def testDisallowPersistentConnections(self): cxn = self.connect(allowPersistentConnections=False) cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\nHost: localhost\r\n\r\nGET / HTTP/1.1\r\nHost: localhost\r\n\r\n") cmds[0] += [('init', 'GET', '/', (1,1), 0, (('Host', ['localhost']),)), ('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.finish() cxn.requests[0].writeResponse(response) data += 'HTTP/1.1 200 OK\r\nContent-Length: 0\r\nConnection: close\r\n\r\n' self.compareResult(cxn, cmds, data) self.assertDone(cxn) def testIgnoreBogusContentLength(self): # Ensure that content-length is ignored when transfer-encoding # is also specified. cxn = self.connect() cmds = [[]] data = "" cxn.client.write("GET / HTTP/1.1\r\nContent-Length: 100\r\nTransfer-Encoding: chunked\r\nHost: localhost\r\n\r\n5\r\nInput\r\n") cmds[0] += [('init', 'GET', '/', (1,1), None, (('Host', ['localhost']),)), ('contentChunk', 'Input')] self.compareResult(cxn, cmds, data) cxn.client.write("0\r\n\r\n") cmds[0] += [('contentComplete',)] self.compareResult(cxn, cmds, data) response = TestResponse() response.finish() cxn.requests[0].writeResponse(response) data += "HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n" self.compareResult(cxn, cmds, data) cxn.client.loseConnection() self.assertDone(cxn) class ErrorTestCase(HTTPTests): def assertStartsWith(self, first, second, msg=None): self.assert_(first.startswith(second), '%r.startswith(%r)' % (first, second)) def checkError(self, cxn, code): self.iterate(cxn) self.assertStartsWith(cxn.client.data, "HTTP/1.1 %d "%code) self.assertIn("\r\nConnection: close\r\n", cxn.client.data) # Ensure error messages have a defined content-length. 
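        # Editor's note: besides "Connection: close", the generated error
        # response is expected to advertise an explicit Content-Length so the
        # error body is unambiguously framed for the client.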
self.assertIn("\r\nContent-Length:", cxn.client.data) self.assertDone(cxn) def testChunkingError1(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\nTransfer-Encoding: chunked\r\n\r\nasdf\r\n") self.checkError(cxn, 400) def testChunkingError2(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\nTransfer-Encoding: chunked\r\n\r\n1\r\nblahblah\r\n") self.checkError(cxn, 400) def testChunkingError3(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\nTransfer-Encoding: chunked\r\n\r\n-1\r\nasdf\r\n") self.checkError(cxn, 400) def testTooManyHeaders(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\n") cxn.client.write("Foo: Bar\r\n"*5000) self.checkError(cxn, 400) def testLineTooLong(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\n") cxn.client.write("Foo: "+("Bar"*10000)) self.checkError(cxn, 400) def testLineTooLong2(self): cxn = self.connect() cxn.client.write("GET "+("/Bar")*10000 +" HTTP/1.1\r\n") self.checkError(cxn, 414) def testNoColon(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\n") cxn.client.write("Blahblah\r\n\r\n") self.checkError(cxn, 400) def testBadRequest(self): cxn = self.connect() cxn.client.write("GET / more HTTP/1.1\r\n") self.checkError(cxn, 400) def testWrongProtocol(self): cxn = self.connect() cxn.client.write("GET / Foobar/1.0\r\n") self.checkError(cxn, 400) def testBadProtocolVersion(self): cxn = self.connect() cxn.client.write("GET / HTTP/1\r\n") self.checkError(cxn, 400) def testBadProtocolVersion2(self): cxn = self.connect() cxn.client.write("GET / HTTP/-1.0\r\n") self.checkError(cxn, 400) def testWrongProtocolVersion(self): cxn = self.connect() cxn.client.write("GET / HTTP/2.0\r\n") self.checkError(cxn, 505) def testUnsupportedTE(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\n") cxn.client.write("Transfer-Encoding: blahblahblah, chunked\r\n\r\n") self.checkError(cxn, 501) def testTEWithoutChunked(self): cxn = self.connect() cxn.client.write("GET / HTTP/1.1\r\n") cxn.client.write("Transfer-Encoding: gzip\r\n\r\n") self.checkError(cxn, 400) class PipelinedErrorTestCase(ErrorTestCase): # Make sure that even low level reading errors don't corrupt the data stream, # but always wait until their turn to respond. def connect(self): cxn = ErrorTestCase.connect(self) cxn.client.write("GET / HTTP/1.1\r\nHost: localhost\r\n\r\n") cmds = [[('init', 'GET', '/', (1,1), 0, (('Host', ['localhost']),)), ('contentComplete', )]] data = "" self.compareResult(cxn, cmds, data) return cxn def checkError(self, cxn, code): self.iterate(cxn) self.assertEquals(cxn.client.data, '') response = TestResponse() response.headers.setRawHeaders("Content-Length", ("0",)) cxn.requests[0].writeResponse(response) response.write('') data = "HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n" self.iterate(cxn) self.assertEquals(cxn.client.data, data) # Reset the data so the checkError's startswith test can work right. cxn.client.data = "" response.finish() ErrorTestCase.checkError(self, cxn, code) class SimpleFactory(channel.HTTPFactory): def buildProtocol(self, addr): # Do a bunch of crazy crap just so that the test case can know when the # connection is done. 
p = channel.HTTPFactory.buildProtocol(self, addr) cl = p.connectionLost def newCl(reason): reactor.callLater(0, lambda: self.testcase.connlost.callback(None)) return cl(reason) p.connectionLost = newCl self.conn = p return p class SimpleRequest(http.Request): def process(self): response = TestResponse() if self.uri == "/error": response.code=402 else: response.code=404 response.write("URI %s unrecognized." % self.uri) response.finish() self.writeResponse(response) class AbstractServerTestMixin: type = None def testBasicWorkingness(self): args = ('-u', util.sibpath(__file__, "simple_client.py"), "basic", str(self.port), self.type) d = waitForDeferred(utils.getProcessOutputAndValue(sys.executable, args=args)) yield d; out,err,code = d.getResult() self.assertEquals(code, 0, "Error output:\n%s" % (err,)) self.assertEquals(out, "HTTP/1.1 402 Payment Required\r\nContent-Length: 0\r\nConnection: close\r\n\r\n") testBasicWorkingness = deferredGenerator(testBasicWorkingness) def testLingeringClose(self): args = ('-u', util.sibpath(__file__, "simple_client.py"), "lingeringClose", str(self.port), self.type) d = waitForDeferred(utils.getProcessOutputAndValue(sys.executable, args=args)) yield d; out,err,code = d.getResult() self.assertEquals(code, 0, "Error output:\n%s" % (err,)) self.assertEquals(out, "HTTP/1.1 402 Payment Required\r\nContent-Length: 0\r\nConnection: close\r\n\r\n") testLingeringClose = deferredGenerator(testLingeringClose) class TCPServerTest(unittest.TestCase, AbstractServerTestMixin): type = 'tcp' def setUp(self): factory=SimpleFactory(requestFactory=SimpleRequest) factory.testcase = self self.factory = factory self.connlost = defer.Deferred() self.socket = reactor.listenTCP(0, factory) self.port = self.socket.getHost().port def tearDown(self): # Make sure the listening port is closed d = defer.maybeDeferred(self.socket.stopListening) def finish(v): # And make sure the established connection is, too self.factory.conn.transport.loseConnection() return self.connlost return d.addCallback(finish) try: from twisted.internet import ssl except ImportError: # happens the first time the interpreter tries to import it ssl = None if ssl and not ssl.supported: # happens second and later times ssl = None certPath = util.sibpath(__file__, "server.pem") class SSLServerTest(unittest.TestCase, AbstractServerTestMixin): type = 'ssl' def setUp(self): sCTX = ssl.DefaultOpenSSLContextFactory(certPath, certPath) factory=SimpleFactory(requestFactory=SimpleRequest) factory.testcase = self self.factory = factory self.connlost = defer.Deferred() self.socket = reactor.listenSSL(0, factory, sCTX) self.port = self.socket.getHost().port def tearDown(self): # Make sure the listening port is closed d = defer.maybeDeferred(self.socket.stopListening) def finish(v): # And make sure the established connection is, too self.factory.conn.transport.loseConnection() return self.connlost return d.addCallback(finish) def testLingeringClose(self): return super(SSLServerTest, self).testLingeringClose() if runtime.platform.isWindows(): # This may not just be Windows, but all platforms with more recent # versions of OpenSSL. Do some more experimentation... 
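        # Editor's note: marking the test as .todo below tells trial to treat a
        # failure of testLingeringClose on this platform as an expected failure
        # rather than as a failure of the suite.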
testLingeringClose.todo = "buffering kills the connection too early; test this some other way" if interfaces.IReactorProcess(reactor, None) is None: TCPServerTest.skip = SSLServerTest.skip = "Required process support missing from reactor" elif interfaces.IReactorSSL(reactor, None) is None: SSLServerTest.skip = "Required SSL support missing from reactor" elif ssl is None: SSLServerTest.skip = "SSL not available, cannot test SSL." TwistedWeb2-8.1.0/twisted/web2/test/test_fastcgi.py0000644000175000017500000000355010457627450020704 0ustar dokodoko from twisted.trial import unittest from twisted.web2.channel import fastcgi from twisted.python import util class FCGI(unittest.TestCase): def testPacketReceived(self): ''' Test that a packet can be received, and that it will cause 'writePacket' to be called. ''' record = fastcgi.Record(fastcgi.FCGI_GET_VALUES, 0, '') req = fastcgi.FastCGIChannelRequest() called = [] def writePacket(rec): self.assertEquals(rec.__class__, fastcgi.Record) called.append(rec) req.writePacket = writePacket req.packetReceived(record) self.assertEquals(len(called), 1) def testPacketWrongVersion(self): ''' Test that a version other than version 1 will raise FastCGIError ''' record = fastcgi.Record(fastcgi.FCGI_GET_VALUES, 0, '', version=2) req = fastcgi.FastCGIChannelRequest() self.failUnless(util.raises(fastcgi.FastCGIError, req.packetReceived, record)) def testPacketBadType(self): ''' Test that an invalid packet type will raise FastCGIError ''' record = fastcgi.Record(99999, 0, '') req = fastcgi.FastCGIChannelRequest() self.failUnless(util.raises(fastcgi.FastCGIError, req.packetReceived, record)) def testParseLongName(self): ''' Test the code paths for parsing a name or value with >= 128 bytes. The length prefixing is done differently in this case. ''' self.assertEqual( [('x'*128, 'y')], list(fastcgi.parseNameValues(fastcgi.writeNameValue('x'*128, 'y')))) def testParseLongValue(self): ''' Test parsing a long value. ''' self.assertEqual( [('x', 'y'*128)], list(fastcgi.parseNameValues(fastcgi.writeNameValue('x', 'y'*128)))) TwistedWeb2-8.1.0/twisted/web2/test/test_log.py0000644000175000017500000001061710563036667020051 0ustar dokodoko# Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. from twisted.web2 import log, resource, http from twisted.web2.test.test_server import BaseCase, BaseTestResource from twisted.python import log as tlog class BufferingLogObserver(log.BaseCommonAccessLoggingObserver): """ A web2 log observer that buffer messages. """ messages = [] def logMessage(self, message): self.messages.append(message) class SetDateWrapperResource(resource.WrapperResource): """ A resource wrapper which sets the date header. """ def hook(self, req): def _filter(req, resp): resp.headers.setHeader('date', 0.0) return resp _filter.handleErrors = True req.addResponseFilter(_filter, atEnd=True) class NoneStreamResource(resource.Resource): """ A basic empty resource. """ def render(self, req): return http.Response(200) class TestLogging(BaseCase): def setUp(self): self.blo = BufferingLogObserver() tlog.addObserver(self.blo.emit) # some default resource setup self.resrc = BaseTestResource() self.resrc.child_emptystream = NoneStreamResource() self.root = SetDateWrapperResource(log.LogWrapperResource(self.resrc)) def tearDown(self): tlog.removeObserver(self.blo.emit) def assertLogged(self, **expected): """ Check that logged messages matches expected format. 
""" if 'date' not in expected: epoch = log.BaseCommonAccessLoggingObserver().logDateString(0) expected['date'] = epoch if 'user' not in expected: expected['user'] = '-' if 'referer' not in expected: expected['referer'] = '-' if 'user-agent' not in expected: expected['user-agent'] = '-' if 'version' not in expected: expected['version'] = '1.1' if 'remotehost' not in expected: expected['remotehost'] = 'remotehost' messages = self.blo.messages[:] del self.blo.messages[:] expectedLog = ('%(remotehost)s - %(user)s [%(date)s] "%(method)s ' '%(uri)s HTTP/%(version)s" %(status)d %(length)d ' '"%(referer)s" "%(user-agent)s"') if expected.get('logged', True): # Ensure there weren't other messages hanging out self.assertEquals(len(messages), 1, "len(%r) != 1" % (messages, )) self.assertEquals(messages[0], expectedLog % expected) else: self.assertEquals(len(messages), 0, "len(%r) != 0" % (messages, )) def test_logSimpleRequest(self): """ Check the log for a simple request. """ uri = 'http://localhost/' method = 'GET' def _cbCheckLog(response): self.assertLogged(method=method, uri=uri, status=response[0], length=response[1].getHeader('content-length')) d = self.getResponseFor(self.root, uri, method=method) d.addCallback(_cbCheckLog) return d def test_logErrors(self): """ Test the error log. """ def test(_, uri, method, **expected): expected['uri'] = uri expected['method'] = method def _cbCheckLog(response): self.assertEquals(response[0], expected['status']) self.assertLogged( length=response[1].getHeader('content-length'), **expected) return self.getResponseFor(self.root, uri, method=method).addCallback(_cbCheckLog) uri = 'http://localhost/foo' # doesn't exist method = 'GET' d = test(None, uri, method, status=404, logged=True) # no host. this should result in a 400 which doesn't get logged uri = 'http:///' d.addCallback(test, uri, method, status=400, logged=False) return d def test_logNoneResponseStream(self): """ Test the log of an empty resource. 
""" uri = 'http://localhost/emptystream' method = 'GET' def _cbCheckLog(response): self.assertLogged(method=method, uri=uri, status=200, length=0) d = self.getResponseFor(self.root, uri, method=method) d.addCallback(_cbCheckLog) return d TwistedWeb2-8.1.0/twisted/web2/test/test_compat.py0000644000175000017500000000325410336433563020544 0ustar dokodokofrom twisted.web2.test.test_server import BaseCase import sys try: from twisted.web import resource class OldWebResource(resource.Resource): def __init__(self, message, *args, **kwargs): self.message = message resource.Resource.__init__(self, *args, **kwargs) isLeaf = True def render(self, req): return self.message except ImportError: resource = None class OldWebCompat(BaseCase): try: import twisted.web except ImportError: skip = "can't run w/o twisted.web" def testOldWebResource(self): ow = OldWebResource('I am an OldWebResource') self.assertResponse((ow, "http://localhost/"), (200, {}, 'I am an OldWebResource')) def testOldWebResourceNotLeaf(self): ow = OldWebResource('I am not a leaf') ow.isLeaf = False self.assertResponse((ow, "http://localhost/"), (200, {}, 'I am not a leaf')) def testOldWebResourceWithChildren(self): ow = OldWebResource('I am an OldWebResource with a child') ow.isLeaf = False ow.putChild('child', OldWebResource('I am a child of an OldWebResource')) self.assertResponse((ow, "http://localhost/"), (200, {}, 'I am an OldWebResource with a child')) self.assertResponse((ow, "http://localhost/child"), (200, {}, 'I am a child of an OldWebResource')) if not resource: OldWebCompat.skip = "can't run w/o twisted.web" TwistedWeb2-8.1.0/twisted/web2/test/test_static.py0000644000175000017500000001133310634573544020552 0ustar dokodokoimport os from twisted.web2.test.test_server import BaseCase from twisted.web2 import static from twisted.web2 import http_headers from twisted.web2 import stream from twisted.web2 import iweb class TestData(BaseCase): def setUp(self): self.text = "Hello, World\n" self.data = static.Data(self.text, "text/plain") def test_dataState(self): """ Test the internal state of the Data object """ self.assert_(hasattr(self.data, "created_time")) self.assertEquals(self.data.data, self.text) self.assertEquals(self.data.type, http_headers.MimeType("text", "plain")) self.assertEquals(self.data.contentType(), http_headers.MimeType("text", "plain")) def test_etag(self): """ Test that we can get an ETag """ self.failUnless(self.data.etag()) def test_render(self): """ Test that the result from Data.render is acceptable, including the response code, the content-type header, and the actual response body itself. 
""" response = iweb.IResponse(self.data.render(None)) self.assertEqual(response.code, 200) self.assert_(response.headers.hasHeader("content-type")) self.assertEqual(response.headers.getHeader("content-type"), http_headers.MimeType("text", "plain")) def checkStream(data): self.assertEquals(str(data), self.text) return stream.readStream(iweb.IResponse(self.data.render(None)).stream, checkStream) class TestFileSaver(BaseCase): def setUpClass(self): self.tempdir = self.mktemp() os.mkdir(self.tempdir) self.root = static.FileSaver(self.tempdir, expectedFields=['FileNameOne'], maxBytes=16) self.root.addSlash = True def uploadFile(self, fieldname, filename, mimetype, content, resrc=None, host='foo', path='/'): if not resrc: resrc = self.root ctype = http_headers.MimeType('multipart', 'form-data', (('boundary', '---weeboundary'),)) return self.getResponseFor(resrc, '/', headers={'host': 'foo', 'content-type': ctype }, length=len(content), method='POST', content="""-----weeboundary\r Content-Disposition: form-data; name="%s"; filename="%s"\r Content-Type: %s\r \r %s\r -----weeboundary--\r """ % (fieldname, filename, mimetype, content)) def _CbAssertInResponse(self, (code, headers, data, failed), expected_response, expectedFailure=False): expected_code, expected_headers, expected_data = expected_response self.assertEquals(code, expected_code) if expected_data is not None: self.failUnlessSubstring(expected_data, data) for key, value in expected_headers.iteritems(): self.assertEquals(headers.getHeader(key), value) self.assertEquals(failed, expectedFailure) def fileNameFromResponse(self, response): (code, headers, data, failure) = response return data[data.index('Saved file')+11:data.index('
')] def assertInResponse(self, response, expected_response, failure=False): d = response d.addCallback(self._CbAssertInResponse, expected_response, failure) return d def test_enforcesMaxBytes(self): return self.assertInResponse( self.uploadFile('FileNameOne', 'myfilename', 'text/html', 'X'*32), (200, {}, 'exceeds maximum length')) def test_enforcesMimeType(self): return self.assertInResponse( self.uploadFile('FileNameOne', 'myfilename', 'application/x-python', 'X'), (200, {}, 'type not allowed')) def test_invalidField(self): return self.assertInResponse( self.uploadFile('NotARealField', 'myfilename', 'text/html', 'X'), (200, {}, 'not a valid field')) def test_reportFileSave(self): return self.assertInResponse( self.uploadFile('FileNameOne', 'myfilename', 'text/plain', 'X'), (200, {}, 'Saved file')) def test_compareFileContents(self): def gotFname(fname): contents = file(fname, 'rb').read() self.assertEquals(contents, 'Test contents\n') d = self.uploadFile('FileNameOne', 'myfilename', 'text/plain', 'Test contents\n') d.addCallback(self.fileNameFromResponse) d.addCallback(gotFname) return d TwistedWeb2-8.1.0/twisted/web2/test/test_scgi.py0000644000175000017500000000704310442325576020210 0ustar dokodokofrom twisted.trial import unittest from twisted.internet import defer from twisted.internet.address import IPv4Address from twisted.web2.test.test_http import LoopbackRelay, TestConnection from twisted.web2.test.test_http import TestClient, HTTPTests, TestRequest from twisted.web2.test.test_client import TestServer, ClientTests from twisted.web2.test.test_server import SimpleRequest from twisted.web2 import server from twisted.web2 import http_headers from twisted.web2 import stream from twisted.web2 import twscgi def parseSCGIHeaders(headers): return zip(*[iter(headers.split(':', 1)[1].split('\x00'))]*2) class SCGITests(HTTPTests): def connect(self, logFile=None): cxn = TestConnection() cxn.client = self.clientProtocol cxn.server = self.serverProtocol cxn.serverToClient = LoopbackRelay(cxn.client, logFile) cxn.clientToServer = LoopbackRelay(cxn.server, logFile) cxn.server.makeConnection(cxn.serverToClient) cxn.client.makeConnection(cxn.clientToServer) return cxn class SCGIClientTests(SCGITests, ClientTests): def setUp(self): self.serverProtocol = TestServer() def doTestSCGI(self, request): if request.stream.length is None: return http.Response(responsecode.LENGTH_REQUIRED) factory = twscgi.SCGIClientProtocolFactory(request) self.clientProtocol = factory.buildProtocol(None) self.cxn = self.connect() return factory.deferred def testSimpleRequest(self): def gotResponse(resp): self.assertEquals(resp.code, 200) self.assertEquals(resp.headers.getHeader('Content-Type'), http_headers.MimeType.fromString('text/plain')) return defer.maybeDeferred(resp.stream.read ).addCallback(self.assertEquals, '42') req = SimpleRequest(None, 'GET', '/') d = self.doTestSCGI(req) d.addCallback(gotResponse) self.iterate(self.cxn) headers = parseSCGIHeaders(self.cxn.server.data) self.assertEquals(headers[0], ('CONTENT_LENGTH', '0')) self.failUnlessIn(('SCGI', '1'), headers) self.writeLines(self.cxn, ['Status: 200 OK', 'Content-Type: text/plain', 'Content-Length: 2', '', '42']) return d def testOperatesOnStreamDirectly(self): def gotResponse(resp): self.assertEquals(resp.code, 200) self.assertEquals(resp.headers.getHeader('Content-Type'), http_headers.MimeType.fromString('text/plain')) stream = resp.stream resp.stream = None return defer.maybeDeferred(stream.read ).addCallback(self.assertEquals, '42') req = 
SimpleRequest(None, 'GET', '/') d = self.doTestSCGI(req) d.addCallback(gotResponse) self.iterate(self.cxn) headers = parseSCGIHeaders(self.cxn.server.data) self.assertEquals(headers[0], ('CONTENT_LENGTH', '0')) self.failUnlessIn(('SCGI', '1'), headers) self.writeLines(self.cxn, ['Status: 200 OK', 'Content-Type: text/plain', 'Content-Length: 2', '', '42']) return d TwistedWeb2-8.1.0/twisted/web2/test/test_resource.py0000644000175000017500000001557410767572311021123 0ustar dokodoko# Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ A test harness for twisted.web2.resource. """ from sets import Set as set from zope.interface import implements from twisted.internet.defer import succeed, fail from twisted.trial import unittest from twisted.web2 import responsecode from twisted.web2.iweb import IResource from twisted.web2.http import Response from twisted.web2.stream import MemoryStream from twisted.web2.resource import RenderMixin, LeafResource from twisted.web2.server import Site, StopTraversal from twisted.web2.test.test_server import SimpleRequest class PreconditionError (Exception): "Precondition Failure" class TestResource (RenderMixin): implements(IResource) def _handler(self, request): if request is None: return responsecode.INTERNAL_SERVER_ERROR return responsecode.NO_CONTENT http_BLEARGH = _handler http_HUCKHUCKBLORP = _handler http_SWEETHOOKUPS = _handler http_HOOKUPS = _handler def preconditions_BLEARGH(self, request): raise PreconditionError() def precondition_HUCKHUCKBLORP(self, request): return fail(None) def preconditions_SWEETHOOKUPS(self, request): return None def preconditions_HOOKUPS(self, request): return succeed(None) renderOutput = "Snootch to the hootch" def render(self, request): response = Response() response.stream = MemoryStream(self.renderOutput) return response def generateResponse(method): resource = TestResource() method = getattr(resource, "http_" + method) return method(SimpleRequest(Site(resource), method, "/")) class RenderMixInTestCase (unittest.TestCase): """ Test RenderMixin. 
""" _my_allowed_methods = set(( "HEAD", "OPTIONS", "TRACE", "GET", "BLEARGH", "HUCKHUCKBLORP", "SWEETHOOKUPS", "HOOKUPS", )) def test_allowedMethods(self): """ RenderMixin.allowedMethods() """ self.assertEquals( set(TestResource().allowedMethods()), self._my_allowed_methods ) def test_checkPreconditions_raises(self): """ RenderMixin.checkPreconditions() Exception raised in checkPreconditions() """ resource = TestResource() request = SimpleRequest(Site(resource), "BLEARGH", "/") # Check that checkPreconditions raises as expected self.assertRaises(PreconditionError, resource.checkPreconditions, request) # Check that renderHTTP calls checkPreconditions self.assertRaises(PreconditionError, resource.renderHTTP, request) def test_checkPreconditions_none(self): """ RenderMixin.checkPreconditions() checkPreconditions() returns None """ resource = TestResource() request = SimpleRequest(Site(resource), "SWEETHOOKUPS", "/") # Check that checkPreconditions without a raise doesn't barf self.assertEquals(resource.renderHTTP(request), responsecode.NO_CONTENT) def test_checkPreconditions_deferred(self): """ RenderMixin.checkPreconditions() checkPreconditions() returns a deferred """ resource = TestResource() request = SimpleRequest(Site(resource), "HOOKUPS", "/") # Check that checkPreconditions without a raise doesn't barf def checkResponse(response): self.assertEquals(response, responsecode.NO_CONTENT) d = resource.renderHTTP(request) d.addCallback(checkResponse) def test_OPTIONS_status(self): """ RenderMixin.http_OPTIONS() Response code is OK """ response = generateResponse("OPTIONS") self.assertEquals(response.code, responsecode.OK) def test_OPTIONS_allow(self): """ RenderMixin.http_OPTIONS() Allow header indicates allowed methods """ response = generateResponse("OPTIONS") self.assertEquals( set(response.headers.getHeader("allow")), self._my_allowed_methods ) def test_TRACE_status(self): """ RenderMixin.http_TRACE() Response code is OK """ response = generateResponse("TRACE") self.assertEquals(response.code, responsecode.OK) def test_TRACE_body(self): """ RenderMixin.http_TRACE() Check body for traciness """ raise NotImplementedError() test_TRACE_body.todo = "Someone should write this test" def test_HEAD_status(self): """ RenderMixin.http_HEAD() Response code is OK """ response = generateResponse("HEAD") self.assertEquals(response.code, responsecode.OK) def test_HEAD_body(self): """ RenderMixin.http_HEAD() Check body is empty """ response = generateResponse("HEAD") self.assertEquals(response.stream.length, 0) test_HEAD_body.todo = ( "http_HEAD is implemented in a goober way that " "relies on the server code to clean up after it." ) def test_GET_status(self): """ RenderMixin.http_GET() Response code is OK """ response = generateResponse("GET") self.assertEquals(response.code, responsecode.OK) def test_GET_body(self): """ RenderMixin.http_GET() Check body is empty """ response = generateResponse("GET") self.assertEquals( str(response.stream.read()), TestResource.renderOutput ) class ResourceTestCase (unittest.TestCase): """ Test Resource. """ def test_addSlash(self): # I think this would include a test of http_GET() raise NotImplementedError() test_addSlash.todo = "Someone should write this test" def test_locateChild(self): raise NotImplementedError() test_locateChild.todo = "Someone should write this test" def test_child_nonsense(self): raise NotImplementedError() test_child_nonsense.todo = "Someone should write this test" class PostableResourceTestCase (unittest.TestCase): """ Test PostableResource. 
""" def test_POST(self): raise NotImplementedError() test_POST.todo = "Someone should write this test" class LeafResourceTestCase (unittest.TestCase): """ Test LeafResource. """ def test_locateChild(self): resource = LeafResource() child, segments = ( resource.locateChild( SimpleRequest(Site(resource), "GET", "/"), ("", "foo"), ) ) self.assertEquals(child, resource) self.assertEquals(segments, StopTraversal) class WrapperResourceTestCase (unittest.TestCase): """ Test WrapperResource. """ def test_hook(self): raise NotImplementedError() test_hook.todo = "Someone should write this test" TwistedWeb2-8.1.0/twisted/web2/test/simple_client.py0000644000175000017500000000156110151165751021044 0ustar dokodokoimport socket, sys test_type = sys.argv[1] port = int(sys.argv[2]) socket_type = sys.argv[3] s = socket.socket(socket.AF_INET) s.connect(("127.0.0.1", port)) s.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 40000) if socket_type == 'ssl': s2 = socket.ssl(s) send=s2.write recv=s2.read else: send=s.send recv=s.recv print >> sys.stderr, ">> Making %s request to port %d" % (socket_type, port) send("GET /error HTTP/1.0\r\n") send("Host: localhost\r\n") if test_type == "lingeringClose": print >> sys.stderr, ">> Sending lots of data" send("Content-Length: 1000000\r\n\r\n") send("X"*1000000) else: send('\r\n') #import time #time.sleep(5) print >> sys.stderr, ">> Getting data" data='' while len(data) < 299999: try: x=recv(10000) except: break if x == '': break data+=x sys.stdout.write(data) TwistedWeb2-8.1.0/twisted/web2/filter/0000755000175000017500000000000011014056216016140 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/filter/location.py0000644000175000017500000000170510505062041020322 0ustar dokodokofrom twisted.web2 import responsecode import urlparse __all__ = ['addLocation'] def addLocation(request, location): """ Add a C{location} header to the response if the response status is CREATED. @param request: L{IRequest} the request being processed @param location: the URI to use in the C{location} header """ def locationFilter(request, response): if (response.code == responsecode.CREATED): # # Check to see whether we have an absolute URI or not. # If not, have the request turn it into an absolute URI. # (scheme, host, path, params, querystring, fragment) = urlparse.urlparse(location) if scheme == "": uri = request.unparseURL(path=location) else: uri = location response.headers.setHeader("location", uri) return response request.addResponseFilter(locationFilter) TwistedWeb2-8.1.0/twisted/web2/filter/gzip.py0000644000175000017500000000541410340730377017477 0ustar dokodokofrom __future__ import generators import struct import zlib from twisted.web2 import stream # TODO: ungzip (can any browsers actually generate gzipped # upload data?) But it's necessary for client anyways. 
def gzipStream(input, compressLevel=6): crc, size = zlib.crc32(''), 0 # magic header, compression method, no flags header = '\037\213\010\000' # timestamp header += struct.pack('= size: end = size - 1 if start >= size: raise UnsatisfiableRangeRequest return start,end def makeUnsatisfiable(request, oldresponse): if request.headers.hasHeader('if-range'): return oldresponse # Return resource instead of error response = http.Response(responsecode.REQUESTED_RANGE_NOT_SATISFIABLE) response.headers.setHeader("content-range", ('bytes', None, None, oldresponse.stream.length)) return response def makeSegment(inputStream, lastOffset, start, end): offset = start - lastOffset length = end + 1 - start if offset != 0: before, inputStream = inputStream.split(offset) before.close() return inputStream.split(length) def rangefilter(request, oldresponse): if oldresponse.stream is None: return oldresponse size = oldresponse.stream.length if size is None: # Does not deal with indeterminate length outputs return oldresponse oldresponse.headers.setHeader('accept-ranges',('bytes',)) rangespec = request.headers.getHeader('range') # If we've got a range header and the If-Range header check passes, and # the range type is bytes, do a partial response. if (rangespec is not None and http.checkIfRange(request, oldresponse) and rangespec[0] == 'bytes'): # If it's a single range, return a simple response if len(rangespec[1]) == 1: try: start,end = canonicalizeRange(rangespec[1][0], size) except UnsatisfiableRangeRequest: return makeUnsatisfiable(request, oldresponse) response = http.Response(responsecode.PARTIAL_CONTENT, oldresponse.headers) response.headers.setHeader('content-range',('bytes',start, end, size)) content, after = makeSegment(oldresponse.stream, 0, start, end) after.close() response.stream = content return response else: # Return a multipart/byteranges response lastOffset = -1 offsetList = [] for arange in rangespec[1]: try: start,end = canonicalizeRange(arange, size) except UnsatisfiableRangeRequest: continue if start <= lastOffset: # Stupid client asking for out-of-order or overlapping ranges, PUNT! return oldresponse offsetList.append((start,end)) lastOffset = end if not offsetList: return makeUnsatisfiable(request, oldresponse) content_type = oldresponse.headers.getRawHeaders('content-type') boundary = "%x%x" % (int(time.time()*1000000), os.getpid()) response = http.Response(responsecode.PARTIAL_CONTENT, oldresponse.headers) response.headers.setHeader('content-type', http_headers.MimeType('multipart', 'byteranges', [('boundary', boundary)])) response.stream = out = stream.CompoundStream() lastOffset = 0 origStream = oldresponse.stream headerString = "\r\n--%s" % boundary if len(content_type) == 1: headerString+='\r\nContent-Type: %s' % content_type[0] headerString+="\r\nContent-Range: %s\r\n\r\n" for start,end in offsetList: out.addStream(headerString % http_headers.generateContentRange(('bytes', start, end, size))) content, origStream = makeSegment(origStream, lastOffset, start, end) lastOffset = end + 1 out.addStream(content) origStream.close() out.addStream("\r\n--%s--\r\n" % boundary) return response else: return oldresponse __all__ = ['rangefilter'] TwistedWeb2-8.1.0/twisted/web2/filter/__init__.py0000644000175000017500000000024010261613314020246 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_cgi -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """ Output filters. 
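The filter callables in this package follow the (request, response) ->
response convention used by request.addResponseFilter; a minimal sketch,
assuming the range module is importable as twisted.web2.filter.range:

    from twisted.web2.filter.range import rangefilter
    request.addResponseFilter(rangefilter)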
""" TwistedWeb2-8.1.0/twisted/web2/auth/0000755000175000017500000000000011014056216015614 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/auth/interfaces.py0000644000175000017500000000357710563450446020340 0ustar dokodokofrom zope.interface import Interface, Attribute class ICredentialFactory(Interface): """ A credential factory provides state between stages in HTTP authentication. It is ultimately in charge of creating an ICredential for the specified scheme, that will be used by cred to complete authentication. """ scheme = Attribute(("string indicating the authentication scheme " "this factory is associated with.")) def getChallenge(peer): """ Generate a challenge the client may respond to. @type peer: L{twisted.internet.interfaces.IAddress} @param peer: The client's address @rtype: C{dict} @return: dictionary of challenge arguments """ def decode(response, request): """ Create a credentials object from the given response. May raise twisted.cred.error.LoginFailed if the response is invalid. @type response: C{str} @param response: scheme specific response string @type request: L{twisted.web2.server.Request} @param request: the request being processed @return: ICredentials """ class IAuthenticatedRequest(Interface): """ A request that has been authenticated with the use of Cred, and holds a reference to the avatar returned by portal.login """ avatarInterface = Attribute(("The credential interface implemented by " "the avatar")) avatar = Attribute("The application specific avatar returned by " "the application's realm") class IHTTPUser(Interface): """ A generic interface that can implemented by an avatar to provide access to the username used when authenticating. """ username = Attribute(("A string representing the username portion of " "the credentials used for authentication"))TwistedWeb2-8.1.0/twisted/web2/auth/digest.py0000644000175000017500000002440210624150303017445 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_httpauth -*- """ Implementation of RFC2617: HTTP Digest Authentication http://www.faqs.org/rfcs/rfc2617.html """ import time from twisted.cred import credentials, error from zope.interface import implements, Interface from twisted.web2.auth.interfaces import ICredentialFactory import md5, sha import random, sys # The digest math algorithms = { 'md5': md5.new, 'md5-sess': md5.new, 'sha': sha.new, } # DigestCalcHA1 def calcHA1( pszAlg, pszUserName, pszRealm, pszPassword, pszNonce, pszCNonce, preHA1=None ): """ @param pszAlg: The name of the algorithm to use to calculate the digest. Currently supported are md5 md5-sess and sha. @param pszUserName: The username @param pszRealm: The realm @param pszPassword: The password @param pszNonce: The nonce @param pszCNonce: The cnonce @param preHA1: If available this is a str containing a previously calculated HA1 as a hex string. If this is given then the values for pszUserName, pszRealm, and pszPassword are ignored. 
""" if (preHA1 and (pszUserName or pszRealm or pszPassword)): raise TypeError(("preHA1 is incompatible with the pszUserName, " "pszRealm, and pszPassword arguments")) if preHA1 is None: # We need to calculate the HA1 from the username:realm:password m = algorithms[pszAlg]() m.update(pszUserName) m.update(":") m.update(pszRealm) m.update(":") m.update(pszPassword) HA1 = m.digest() else: # We were given a username:realm:password HA1 = preHA1.decode('hex') if pszAlg == "md5-sess": m = algorithms[pszAlg]() m.update(HA1) m.update(":") m.update(pszNonce) m.update(":") m.update(pszCNonce) HA1 = m.digest() return HA1.encode('hex') # DigestCalcResponse def calcResponse( HA1, algo, pszNonce, pszNonceCount, pszCNonce, pszQop, pszMethod, pszDigestUri, pszHEntity, ): m = algorithms[algo]() m.update(pszMethod) m.update(":") m.update(pszDigestUri) if pszQop == "auth-int": m.update(":") m.update(pszHEntity) HA2 = m.digest().encode('hex') m = algorithms[algo]() m.update(HA1) m.update(":") m.update(pszNonce) m.update(":") if pszNonceCount and pszCNonce: # pszQop: m.update(pszNonceCount) m.update(":") m.update(pszCNonce) m.update(":") m.update(pszQop) m.update(":") m.update(HA2) respHash = m.digest().encode('hex') return respHash class IUsernameDigestHash(Interface): """ This credential is used when a CredentialChecker has access to the hash of the username:realm:password as in an Apache .htdigest file. """ def checkHash(self, digestHash): """ @param digestHash: The hashed username:realm:password to check against. @return: a deferred which becomes, or a boolean indicating if the hash matches. """ class DigestedCredentials: """Yet Another Simple HTTP Digest authentication scheme""" implements(credentials.IUsernameHashedPassword, IUsernameDigestHash) def __init__(self, username, method, realm, fields): self.username = username self.method = method self.realm = realm self.fields = fields def checkPassword(self, password): response = self.fields.get('response') uri = self.fields.get('uri') nonce = self.fields.get('nonce') cnonce = self.fields.get('cnonce') nc = self.fields.get('nc') algo = self.fields.get('algorithm', 'md5').lower() qop = self.fields.get('qop', 'auth') expected = calcResponse( calcHA1(algo, self.username, self.realm, password, nonce, cnonce), algo, nonce, nc, cnonce, qop, self.method, uri, None ) return expected == response def checkHash(self, digestHash): response = self.fields.get('response') uri = self.fields.get('uri') nonce = self.fields.get('nonce') cnonce = self.fields.get('cnonce') nc = self.fields.get('nc') algo = self.fields.get('algorithm', 'md5').lower() qop = self.fields.get('qop', 'auth') expected = calcResponse( calcHA1(algo, None, None, None, nonce, cnonce, preHA1=digestHash), algo, nonce, nc, cnonce, qop, self.method, uri, None ) return expected == response class DigestCredentialFactory(object): """ Support for RFC2617 HTTP Digest Authentication @cvar CHALLENGE_LIFETIME_SECS: The number of seconds for which an opaque should be valid. @ivar privateKey: A random string used for generating the secure opaque. 
""" implements(ICredentialFactory) CHALLENGE_LIFETIME_SECS = 15 * 60 # 15 minutes scheme = "digest" def __init__(self, algorithm, realm): """ @type algorithm: C{str} @param algorithm: case insensitive string that specifies the hash algorithm used, should be either, md5, md5-sess or sha @type realm: C{str} @param realm: case sensitive string that specifies the realm portion of the challenge """ self.algorithm = algorithm self.realm = realm c = tuple([random.randrange(sys.maxint) for _ in range(3)]) self.privateKey = '%d%d%d' % c def generateNonce(self): c = tuple([random.randrange(sys.maxint) for _ in range(3)]) c = '%d%d%d' % c return c def _getTime(self): """ Parameterize the time based seed used in generateOpaque so we can deterministically unittest it's behavior. """ return time.time() def generateOpaque(self, nonce, clientip): """ Generate an opaque to be returned to the client. This should be a unique string that can be returned to us and verified. """ # Now, what we do is encode the nonce, client ip and a timestamp # in the opaque value with a suitable digest key = "%s,%s,%s" % (nonce, clientip, str(int(self._getTime()))) digest = md5.new(key + self.privateKey).hexdigest() ekey = key.encode('base64') return "%s-%s" % (digest, ekey.strip('\n')) def verifyOpaque(self, opaque, nonce, clientip): """ Given the opaque and nonce from the request, as well as the clientip that made the request, verify that the opaque was generated by us. And that it's not too old. @param opaque: The opaque value from the Digest response @param nonce: The nonce value from the Digest response @param clientip: The remote IP address of the client making the request @return: C{True} if the opaque was successfully verified. @raise error.LoginFailed: if C{opaque} could not be parsed or contained the wrong values. """ # First split the digest from the key opaqueParts = opaque.split('-') if len(opaqueParts) != 2: raise error.LoginFailed('Invalid response, invalid opaque value') # Verify the key key = opaqueParts[1].decode('base64') keyParts = key.split(',') if len(keyParts) != 3: raise error.LoginFailed('Invalid response, invalid opaque value') if keyParts[0] != nonce: raise error.LoginFailed( 'Invalid response, incompatible opaque/nonce values') if keyParts[1] != clientip: raise error.LoginFailed( 'Invalid response, incompatible opaque/client values') if (int(self._getTime()) - int(keyParts[2]) > DigestCredentialFactory.CHALLENGE_LIFETIME_SECS): raise error.LoginFailed( 'Invalid response, incompatible opaque/nonce too old') # Verify the digest digest = md5.new(key + self.privateKey).hexdigest() if digest != opaqueParts[0]: raise error.LoginFailed('Invalid response, invalid opaque value') return True def getChallenge(self, peer): """ Generate the challenge for use in the WWW-Authenticate header @param peer: The L{IAddress} of the requesting client. @return: The C{dict} that can be used to generate a WWW-Authenticate header. """ c = self.generateNonce() o = self.generateOpaque(c, peer.host) return {'nonce': c, 'opaque': o, 'qop': 'auth', 'algorithm': self.algorithm, 'realm': self.realm} def decode(self, response, request): """ Decode the given response and attempt to generate a L{DigestedCredentials} from it. 
@type response: C{str} @param response: A string of comma seperated key=value pairs @type request: L{twisted.web2.server.Request} @param request: the request being processed @return: L{DigestedCredentials} @raise: L{error.LoginFailed} if the response does not contain a username, a nonce, an opaque, or if the opaque is invalid. """ def unq(s): if s[0] == s[-1] == '"': return s[1:-1] return s response = ' '.join(response.splitlines()) parts = response.split(',') auth = {} for (k, v) in [p.split('=', 1) for p in parts]: auth[k.strip()] = unq(v.strip()) username = auth.get('username') if not username: raise error.LoginFailed('Invalid response, no username given.') if 'opaque' not in auth: raise error.LoginFailed('Invalid response, no opaque given.') if 'nonce' not in auth: raise error.LoginFailed('Invalid response, no nonce given.') # Now verify the nonce/opaque values for this client if self.verifyOpaque(auth.get('opaque'), auth.get('nonce'), request.remoteAddr.host): return DigestedCredentials(username, request.method, self.realm, auth) TwistedWeb2-8.1.0/twisted/web2/auth/__init__.py0000644000175000017500000000010110377006011017714 0ustar dokodoko""" Client and server implementations of http authentication """ TwistedWeb2-8.1.0/twisted/web2/auth/wrapper.py0000644000175000017500000001620610573065164017666 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_httpauth -*- """ Wrapper Resources for rfc2617 HTTP Auth. """ from zope.interface import implements, directlyProvides from twisted.cred import error, credentials from twisted.python import failure from twisted.web2 import responsecode from twisted.web2 import http from twisted.web2 import iweb from twisted.web2.auth.interfaces import IAuthenticatedRequest class UnauthorizedResponse(http.StatusResponse): """A specialized response class for generating www-authenticate headers from the given L{CredentialFactory} instances """ def __init__(self, factories, remoteAddr=None): """ @param factories: A L{dict} of {'scheme': ICredentialFactory} @param remoteAddr: An L{IAddress} for the connecting client. """ super(UnauthorizedResponse, self).__init__( responsecode.UNAUTHORIZED, "You are not authorized to access this resource.") authHeaders = [] for factory in factories.itervalues(): authHeaders.append((factory.scheme, factory.getChallenge(remoteAddr))) self.headers.setHeader('www-authenticate', authHeaders) class HTTPAuthResource(object): """I wrap a resource to prevent it being accessed unless the authentication can be completed using the credential factory, portal, and interfaces specified. """ implements(iweb.IResource) def __init__(self, wrappedResource, credentialFactories, portal, interfaces): """ @param wrappedResource: A L{twisted.web2.iweb.IResource} to be returned from locateChild and render upon successful authentication. @param credentialFactories: A list of instances that implement L{ICredentialFactory}. @type credentialFactories: L{list} @param portal: Portal to handle logins for this resource. @type portal: L{twisted.cred.portal.Portal} @param interfaces: the interfaces that are allowed to log in via the given portal @type interfaces: L{tuple} """ self.wrappedResource = wrappedResource self.credentialFactories = dict([(factory.scheme, factory) for factory in credentialFactories]) self.portal = portal self.interfaces = interfaces def _loginSucceeded(self, avatar, request): """ Callback for successful login. @param avatar: A tuple of the form (interface, avatar) as returned by your realm. 
@param request: L{IRequest} that encapsulates this auth attempt. @return: the IResource in C{self.wrappedResource} """ request.avatarInterface, request.avatar = avatar directlyProvides(request, IAuthenticatedRequest) def _addAuthenticateHeaders(request, response): """ A response filter that adds www-authenticate headers to an outgoing response if it's code is UNAUTHORIZED (401) and it does not already have them. """ if response.code == responsecode.UNAUTHORIZED: if not response.headers.hasHeader('www-authenticate'): newResp = UnauthorizedResponse(self.credentialFactories, request.remoteAddr) response.headers.setHeader( 'www-authenticate', newResp.headers.getHeader('www-authenticate')) return response _addAuthenticateHeaders.handleErrors = True request.addResponseFilter(_addAuthenticateHeaders) return self.wrappedResource def _loginFailed(self, result, request): """ Errback for failed login. @param result: L{Failure} returned by portal.login @param request: L{IRequest} that encapsulates this auth attempt. @return: A L{Failure} containing an L{HTTPError} containing the L{UnauthorizedResponse} if C{result} is an L{UnauthorizedLogin} or L{UnhandledCredentials} error """ result.trap(error.UnauthorizedLogin, error.UnhandledCredentials) return failure.Failure( http.HTTPError( UnauthorizedResponse( self.credentialFactories, request.remoteAddr))) def login(self, factory, response, request): """ @param factory: An L{ICredentialFactory} that understands the given response. @param response: The client's authentication response as a string. @param request: The request that prompted this authentication attempt. @return: A L{Deferred} that fires with the wrappedResource on success or a failure containing an L{UnauthorizedResponse} """ try: creds = factory.decode(response, request) except error.LoginFailed: raise http.HTTPError(UnauthorizedResponse( self.credentialFactories, request.remoteAddr)) return self.portal.login(creds, None, *self.interfaces ).addCallbacks(self._loginSucceeded, self._loginFailed, (request,), None, (request,), None) def authenticate(self, request): """ Attempt to authenticate the givin request @param request: An L{IRequest} to be authenticated. """ authHeader = request.headers.getHeader('authorization') if authHeader is None: return self.portal.login(credentials.Anonymous(), None, *self.interfaces ).addCallbacks(self._loginSucceeded, self._loginFailed, (request,), None, (request,), None) elif authHeader[0] not in self.credentialFactories: raise http.HTTPError(UnauthorizedResponse( self.credentialFactories, request.remoteAddr)) else: return self.login(self.credentialFactories[authHeader[0]], authHeader[1], request) def locateChild(self, request, seg): """ Authenticate the request then return the C{self.wrappedResource} and the unmodified segments. 
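        (The resource element of the returned tuple is whatever authenticate()
        produces, normally a deferred that fires with C{self.wrappedResource}
        once the portal login succeeds.)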
""" return self.authenticate(request), seg def renderHTTP(self, request): """ Authenticate the request then return the result of calling renderHTTP on C{self.wrappedResource} """ def _renderResource(resource): return resource.renderHTTP(request) d = self.authenticate(request) d.addCallback(_renderResource) return d TwistedWeb2-8.1.0/twisted/web2/auth/basic.py0000644000175000017500000000155710563450446017272 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_httpauth -*- from twisted.cred import credentials, error from twisted.web2.auth.interfaces import ICredentialFactory from zope.interface import implements class BasicCredentialFactory(object): """ Credential Factory for HTTP Basic Authentication """ implements(ICredentialFactory) scheme = 'basic' def __init__(self, realm): self.realm = realm def getChallenge(self, peer): return {'realm': self.realm} def decode(self, response, request): try: creds = (response + '===').decode('base64') except: raise error.LoginFailed('Invalid credentials') creds = creds.split(':', 1) if len(creds) == 2: return credentials.UsernamePassword(*creds) else: raise error.LoginFailed('Invalid credentials') TwistedWeb2-8.1.0/twisted/web2/responsecode.py0000644000175000017500000000743610403341421017724 0ustar dokodoko# -*- test-case-name: twisted.web2.test -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. CONTINUE = 100 SWITCHING = 101 OK = 200 CREATED = 201 ACCEPTED = 202 NON_AUTHORITATIVE_INFORMATION = 203 NO_CONTENT = 204 RESET_CONTENT = 205 PARTIAL_CONTENT = 206 MULTI_STATUS = 207 MULTIPLE_CHOICE = 300 MOVED_PERMANENTLY = 301 FOUND = 302 SEE_OTHER = 303 NOT_MODIFIED = 304 USE_PROXY = 305 TEMPORARY_REDIRECT = 307 BAD_REQUEST = 400 UNAUTHORIZED = 401 PAYMENT_REQUIRED = 402 FORBIDDEN = 403 NOT_FOUND = 404 NOT_ALLOWED = 405 NOT_ACCEPTABLE = 406 PROXY_AUTH_REQUIRED = 407 REQUEST_TIMEOUT = 408 CONFLICT = 409 GONE = 410 LENGTH_REQUIRED = 411 PRECONDITION_FAILED = 412 REQUEST_ENTITY_TOO_LARGE = 413 REQUEST_URI_TOO_LONG = 414 UNSUPPORTED_MEDIA_TYPE = 415 REQUESTED_RANGE_NOT_SATISFIABLE = 416 EXPECTATION_FAILED = 417 UNPROCESSABLE_ENTITY = 422 # RFC 2518 LOCKED = 423 # RFC 2518 FAILED_DEPENDENCY = 424 # RFC 2518 INTERNAL_SERVER_ERROR = 500 NOT_IMPLEMENTED = 501 BAD_GATEWAY = 502 SERVICE_UNAVAILABLE = 503 GATEWAY_TIMEOUT = 504 HTTP_VERSION_NOT_SUPPORTED = 505 INSUFFICIENT_STORAGE_SPACE = 507 NOT_EXTENDED = 510 RESPONSES = { # 100 CONTINUE: "Continue", SWITCHING: "Switching Protocols", # 200 OK: "OK", CREATED: "Created", ACCEPTED: "Accepted", NON_AUTHORITATIVE_INFORMATION: "Non-Authoritative Information", NO_CONTENT: "No Content", RESET_CONTENT: "Reset Content.", PARTIAL_CONTENT: "Partial Content", MULTI_STATUS: "Multi-Status", # 300 MULTIPLE_CHOICE: "Multiple Choices", MOVED_PERMANENTLY: "Moved Permanently", FOUND: "Found", SEE_OTHER: "See Other", NOT_MODIFIED: "Not Modified", USE_PROXY: "Use Proxy", # 306 unused TEMPORARY_REDIRECT: "Temporary Redirect", # 400 BAD_REQUEST: "Bad Request", UNAUTHORIZED: "Unauthorized", PAYMENT_REQUIRED: "Payment Required", FORBIDDEN: "Forbidden", NOT_FOUND: "Not Found", NOT_ALLOWED: "Method Not Allowed", NOT_ACCEPTABLE: "Not Acceptable", PROXY_AUTH_REQUIRED: "Proxy Authentication Required", REQUEST_TIMEOUT: "Request Time-out", CONFLICT: "Conflict", GONE: "Gone", LENGTH_REQUIRED: "Length Required", PRECONDITION_FAILED: "Precondition Failed", REQUEST_ENTITY_TOO_LARGE: "Request Entity Too Large", REQUEST_URI_TOO_LONG: "Request-URI Too Long", UNSUPPORTED_MEDIA_TYPE: "Unsupported Media Type", 
REQUESTED_RANGE_NOT_SATISFIABLE: "Requested Range not satisfiable", EXPECTATION_FAILED: "Expectation Failed", UNPROCESSABLE_ENTITY: "Unprocessable Entity", LOCKED: "Locked", FAILED_DEPENDENCY: "Failed Dependency", # 500 INTERNAL_SERVER_ERROR: "Internal Server Error", NOT_IMPLEMENTED: "Not Implemented", BAD_GATEWAY: "Bad Gateway", SERVICE_UNAVAILABLE: "Service Unavailable", GATEWAY_TIMEOUT: "Gateway Time-out", HTTP_VERSION_NOT_SUPPORTED: "HTTP Version not supported", INSUFFICIENT_STORAGE_SPACE: "Insufficient Storage Space", NOT_EXTENDED: "Not Extended" } # No __all__ necessary -- everything is exported TwistedWeb2-8.1.0/twisted/web2/error.py0000644000175000017500000001125210340730377016367 0ustar dokodoko# Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """ Default error output filter for twisted.web2. """ from twisted.web2 import stream, http_headers from twisted.web2.responsecode import * # 300 - Should include entity with choices # 301 - # 304 - Must include Date, ETag, Content-Location, Expires, Cache-Control, Vary. # # 401 - Must include WWW-Authenticate. # 405 - Must include Allow. # 406 - Should include entity describing allowable characteristics # 407 - Must include Proxy-Authenticate # 413 - May include Retry-After # 416 - Should include Content-Range # 503 - Should include Retry-After ERROR_MESSAGES = { # 300 # no MULTIPLE_CHOICES MOVED_PERMANENTLY: 'The document has permanently moved here.', FOUND: 'The document has temporarily moved here.', SEE_OTHER: 'The results are available here.', # no NOT_MODIFIED USE_PROXY: "Access to this resource must be through the proxy %(location)s.", # 306 unused TEMPORARY_REDIRECT: 'The document has temporarily moved here.', # 400 BAD_REQUEST: "Your browser sent an invalid request.", UNAUTHORIZED: "You are not authorized to view the resource at %(uri)s. Perhaps you entered a wrong password, or perhaps your browser doesn't support authentication.", PAYMENT_REQUIRED: "Payment Required (useful result code, this...).", FORBIDDEN: "You don't have permission to access %(uri)s.", NOT_FOUND: "The resource %(uri)s cannot be found.", NOT_ALLOWED: "The requested method %(method)s is not supported by %(uri)s.", NOT_ACCEPTABLE: "No representation of %(uri)s that is acceptable to your client could be found.", PROXY_AUTH_REQUIRED: "You are not authorized to view the resource at %(uri)s. Perhaps you entered a wrong password, or perhaps your browser doesn't support authentication.", REQUEST_TIMEOUT: "Server timed out waiting for your client to finish sending the HTTP request.", CONFLICT: "Conflict (?)", GONE: "The resource %(uri)s has been permanently removed.", LENGTH_REQUIRED: "The resource %(uri)s requires a Content-Length header.", PRECONDITION_FAILED: "A precondition evaluated to false.", REQUEST_ENTITY_TOO_LARGE: "The provided request entity data is too longer than the maximum for the method %(method)s at %(uri)s.", REQUEST_URI_TOO_LONG: "The request URL is longer than the maximum on this server.", UNSUPPORTED_MEDIA_TYPE: "The provided request data has a format not understood by the resource at %(uri)s.", REQUESTED_RANGE_NOT_SATISFIABLE: "None of the ranges given in the Range request header are satisfiable by the resource %(uri)s.", EXPECTATION_FAILED: "The server does support one of the expectations given in the Expect header.", # 500 INTERNAL_SERVER_ERROR: "An internal error occurred trying to process your request. 
Sorry.", NOT_IMPLEMENTED: "Some functionality requested is not implemented on this server.", BAD_GATEWAY: "An upstream server returned an invalid response.", SERVICE_UNAVAILABLE: "This server cannot service your request becaues it is overloaded.", GATEWAY_TIMEOUT: "An upstream server is not responding.", HTTP_VERSION_NOT_SUPPORTED: "HTTP Version not supported.", INSUFFICIENT_STORAGE_SPACE: "There is insufficient storage space available to perform that request.", NOT_EXTENDED: "This server does not support the a mandatory extension requested." } # Is there a good place to keep this function? def _escape(original): if original is None: return None return original.replace("&", "&").replace("<", "<").replace(">", ">").replace("\"", """) def defaultErrorHandler(request, response): if response.stream is not None: # Already got an error message return response if response.code < 300: # We only do error messages return response message = ERROR_MESSAGES.get(response.code, None) if message is None: # No message specified for that code return response message = message % { 'uri':_escape(request.uri), 'location':_escape(response.headers.getHeader('location')), 'method':_escape(request.method) } title = RESPONSES.get(response.code, "") body = ("%d %s" "

%s

%s") % ( response.code, title, title, message) response.headers.setHeader("content-type", http_headers.MimeType('text', 'html')) response.stream = stream.MemoryStream(body) return response defaultErrorHandler.handleErrors = True __all__ = ['defaultErrorHandler',] TwistedWeb2-8.1.0/twisted/web2/stream.py0000644000175000017500000010367110514325306016533 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_stream -*- """ The stream module provides a simple abstraction of streaming data. While Twisted already has some provisions for handling this in its Producer/Consumer model, the rather complex interactions between producer and consumer makes it difficult to implement something like the CompoundStream object. Thus, this API. The IStream interface is very simple. It consists of two methods: read, and close. The read method should either return some data, None if there is no data left to read, or a Deferred. Close frees up any underlying resources and causes read to return None forevermore. IByteStream adds a bit more to the API: 1) read is required to return objects conforming to the buffer interface. 2) .length, which may either an integer number of bytes remaining, or None if unknown 3) .split(position). Split takes a position, and splits the stream in two pieces, returning the two new streams. Using the original stream after calling split is not allowed. There are two builtin source stream classes: FileStream and MemoryStream. The first produces data from a file object, the second from a buffer in memory. Any number of these can be combined into one stream with the CompoundStream object. Then, to interface with other parts of Twisted, there are two transcievers: StreamProducer and ProducerStream. The first takes a stream and turns it into an IPushProducer, which will write to a consumer. The second is a consumer which is a stream, so that other producers can write to it. """ from __future__ import generators import copy, os, types, sys from zope.interface import Interface, Attribute, implements from twisted.internet.defer import Deferred from twisted.internet import interfaces as ti_interfaces, defer, reactor, protocol, error as ti_error from twisted.python import components, log from twisted.python.failure import Failure # Python 2.4.2 (only) has a broken mmap that leaks a fd every time you call it. if sys.version_info[0:3] != (2,4,2): try: import mmap except ImportError: mmap = None else: mmap = None ############################## #### Interfaces #### ############################## class IStream(Interface): """A stream of arbitrary data.""" def read(): """Read some data. Returns some object representing the data. If there is no more data available, returns None. Can also return a Deferred resulting in one of the above. Errors may be indicated by exception or by a Deferred of a Failure. """ def close(): """Prematurely close. Should also cause further reads to return None.""" class IByteStream(IStream): """A stream which is of bytes.""" length = Attribute("""How much data is in this stream. Can be None if unknown.""") def read(): """Read some data. Returns an object conforming to the buffer interface, or if there is no more data available, returns None. Can also return a Deferred resulting in one of the above. Errors may be indicated by exception or by a Deferred of a Failure. """ def split(point): """Split this stream into two, at byte position 'point'. Returns a tuple of (before, after). After calling split, no other methods should be called on this stream. 
Doing so will have undefined behavior. If you cannot implement split easily, you may implement it as:: return fallbackSplit(self, point) """ def close(): """Prematurely close this stream. Should also cause further reads to return None. Additionally, .length should be set to 0. """ class ISendfileableStream(Interface): def read(sendfile=False): """ Read some data. If sendfile == False, returns an object conforming to the buffer interface, or else a Deferred. If sendfile == True, returns either the above, or a SendfileBuffer. """ class SimpleStream(object): """Superclass of simple streams with a single buffer and a offset and length into that buffer.""" implements(IByteStream) length = None start = None def read(self): return None def close(self): self.length = 0 def split(self, point): if self.length is not None: if point > self.length: raise ValueError("split point (%d) > length (%d)" % (point, self.length)) b = copy.copy(self) self.length = point if b.length is not None: b.length -= point b.start += point return (self, b) ############################## #### FileStream #### ############################## # maximum mmap size MMAP_LIMIT = 4*1024*1024 # minimum mmap size MMAP_THRESHOLD = 8*1024 # maximum sendfile length SENDFILE_LIMIT = 16777216 # minimum sendfile size SENDFILE_THRESHOLD = 256 def mmapwrapper(*args, **kwargs): """ Python's mmap call sucks and ommitted the "offset" argument for no discernable reason. Replace this with a mmap module that has offset. """ offset = kwargs.get('offset', None) if offset in [None, 0]: if 'offset' in kwargs: del kwargs['offset'] else: raise mmap.error("mmap: Python sucks and does not support offset.") return mmap.mmap(*args, **kwargs) class FileStream(SimpleStream): implements(ISendfileableStream) """A stream that reads data from a file. File must be a normal file that supports seek, (e.g. not a pipe or device or socket).""" # 65K, minus some slack CHUNK_SIZE = 2 ** 2 ** 2 ** 2 - 32 f = None def __init__(self, f, start=0, length=None, useMMap=bool(mmap)): """ Create the stream from file f. If you specify start and length, use only that portion of the file. """ self.f = f self.start = start if length is None: self.length = os.fstat(f.fileno()).st_size else: self.length = length self.useMMap = useMMap def read(self, sendfile=False): if self.f is None: return None length = self.length if length == 0: self.f = None return None if sendfile and length > SENDFILE_THRESHOLD: # XXX: Yay using non-existent sendfile support! # FIXME: if we return a SendfileBuffer, and then sendfile # fails, then what? Or, what if file is too short? readSize = min(length, SENDFILE_LIMIT) res = SendfileBuffer(self.f, self.start, readSize) self.length -= readSize self.start += readSize return res if self.useMMap and length > MMAP_THRESHOLD: readSize = min(length, MMAP_LIMIT) try: res = mmapwrapper(self.f.fileno(), readSize, access=mmap.ACCESS_READ, offset=self.start) #madvise(res, MADV_SEQUENTIAL) self.length -= readSize self.start += readSize return res except mmap.error: pass # Fall back to standard read. 
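            # (Plain read path: seek to the current offset, read at most
            # CHUNK_SIZE bytes, and advance start/length by however much was
            # actually read.)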
readSize = min(length, self.CHUNK_SIZE) self.f.seek(self.start) b = self.f.read(readSize) bytesRead = len(b) if not bytesRead: raise RuntimeError("Ran out of data reading file %r, expected %d more bytes" % (self.f, length)) else: self.length -= bytesRead self.start += bytesRead return b def close(self): self.f = None SimpleStream.close(self) components.registerAdapter(FileStream, file, IByteStream) ############################## #### MemoryStream #### ############################## class MemoryStream(SimpleStream): """A stream that reads data from a buffer object.""" def __init__(self, mem, start=0, length=None): """ Create the stream from buffer object mem. If you specify start and length, use only that portion of the buffer. """ self.mem = mem self.start = start if length is None: self.length = len(mem) - start else: if len(mem) < length: raise ValueError("len(mem) < start + length") self.length = length def read(self): if self.mem is None: return None if self.length == 0: result = None else: result = buffer(self.mem, self.start, self.length) self.mem = None self.length = 0 return result def close(self): self.mem = None SimpleStream.close(self) components.registerAdapter(MemoryStream, str, IByteStream) components.registerAdapter(MemoryStream, types.BufferType, IByteStream) ############################## #### CompoundStream #### ############################## class CompoundStream(object): """A stream which is composed of many other streams. Call addStream to add substreams. """ implements(IByteStream, ISendfileableStream) deferred = None length = 0 def __init__(self, buckets=()): self.buckets = [IByteStream(s) for s in buckets] def addStream(self, bucket): """Add a stream to the output""" bucket = IByteStream(bucket) self.buckets.append(bucket) if self.length is not None: if bucket.length is None: self.length = None else: self.length += bucket.length def read(self, sendfile=False): if self.deferred is not None: raise RuntimeError("Call to read while read is already outstanding") if not self.buckets: return None if sendfile and ISendfileableStream.providedBy(self.buckets[0]): try: result = self.buckets[0].read(sendfile) except: return self._gotFailure(Failure()) else: try: result = self.buckets[0].read() except: return self._gotFailure(Failure()) if isinstance(result, Deferred): self.deferred = result result.addCallbacks(self._gotRead, self._gotFailure, (sendfile,)) return result return self._gotRead(result, sendfile) def _gotFailure(self, f): self.deferred = None del self.buckets[0] self.close() return f def _gotRead(self, result, sendfile): self.deferred = None if result is None: del self.buckets[0] # Next bucket return self.read(sendfile) if self.length is not None: self.length -= len(result) return result def split(self, point): num = 0 origPoint = point for bucket in self.buckets: num+=1 if point == 0: b = CompoundStream() b.buckets = self.buckets[num:] del self.buckets[num:] return self,b if bucket.length is None: # Indeterminate length bucket. # give up and use fallback splitter. 
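                # (fallbackSplit wraps this stream in the TruncaterStream /
                # PostTruncaterStream pair defined later in this module, so
                # the split point is enforced lazily as data is read.)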
return fallbackSplit(self, origPoint) if point < bucket.length: before,after = bucket.split(point) b = CompoundStream() b.buckets = self.buckets[num:] b.buckets[0] = after del self.buckets[num+1:] self.buckets[num] = before return self,b point -= bucket.length def close(self): for bucket in self.buckets: bucket.close() self.buckets = [] self.length = 0 ############################## #### readStream #### ############################## class _StreamReader(object): """Process a stream's data using callbacks for data and stream finish.""" def __init__(self, stream, gotDataCallback): self.stream = stream self.gotDataCallback = gotDataCallback self.result = Deferred() def run(self): # self.result may be del'd in _read() result = self.result self._read() return result def _read(self): try: result = self.stream.read() except: self._gotError(Failure()) return if isinstance(result, Deferred): result.addCallbacks(self._gotData, self._gotError) else: self._gotData(result) def _gotError(self, failure): result = self.result del self.result, self.gotDataCallback, self.stream result.errback(failure) def _gotData(self, data): if data is None: result = self.result del self.result, self.gotDataCallback, self.stream result.callback(None) return try: self.gotDataCallback(data) except: self._gotError(Failure()) return reactor.callLater(0, self._read) def readStream(stream, gotDataCallback): """Pass a stream's data to a callback. Returns Deferred which will be triggered on finish. Errors in reading the stream or in processing it will be returned via this Deferred. """ return _StreamReader(stream, gotDataCallback).run() def readAndDiscard(stream): """Read all the data from the given stream, and throw it out. Returns Deferred which will be triggered on finish. """ return readStream(stream, lambda _: None) def readIntoFile(stream, outFile): """Read a stream and write it into a file. Returns Deferred which will be triggered on finish. """ def done(_): outFile.close() return _ return readStream(stream, outFile.write).addBoth(done) def connectStream(inputStream, factory): """Connect a protocol constructed from a factory to stream. Returns an output stream from the protocol. The protocol's transport will have a finish() method it should call when done writing. 
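    A rough usage sketch (the parser protocol class named here is an
    assumption, not something this module provides):

        factory = protocol.Factory()
        factory.protocol = SomeParserProtocol
        outputStream = connectStream(inputStream, factory)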
""" # XXX deal better with addresses p = factory.buildProtocol(None) out = ProducerStream() out.disconnecting = False # XXX for LineReceiver suckage p.makeConnection(out) readStream(inputStream, lambda _: p.dataReceived(_)).addCallbacks( lambda _: p.connectionLost(ti_error.ConnectionDone()), lambda _: p.connectionLost(_)) return out ############################## #### fallbackSplit #### ############################## def fallbackSplit(stream, point): after = PostTruncaterStream(stream, point) before = TruncaterStream(stream, point, after) return (before, after) class TruncaterStream(object): def __init__(self, stream, point, postTruncater): self.stream = stream self.length = point self.postTruncater = postTruncater def read(self): if self.length == 0: if self.postTruncater is not None: postTruncater = self.postTruncater self.postTruncater = None postTruncater.sendInitialSegment(self.stream.read()) self.stream = None return None result = self.stream.read() if isinstance(result, Deferred): return result.addCallback(self._gotRead) else: return self._gotRead(result) def _gotRead(self, data): if data is None: raise ValueError("Ran out of data for a split of a indeterminate length source") if self.length >= len(data): self.length -= len(data) return data else: before = buffer(data, 0, self.length) after = buffer(data, self.length) self.length = 0 if self.postTruncater is not None: postTruncater = self.postTruncater self.postTruncater = None postTruncater.sendInitialSegment(after) self.stream = None return before def split(self, point): if point > self.length: raise ValueError("split point (%d) > length (%d)" % (point, self.length)) post = PostTruncaterStream(self.stream, point) trunc = TruncaterStream(post, self.length - point, self.postTruncater) self.length = point self.postTruncater = post return self, trunc def close(self): if self.postTruncater is not None: self.postTruncater.notifyClosed(self) else: # Nothing cares about the rest of the stream self.stream.close() self.stream = None self.length = 0 class PostTruncaterStream(object): deferred = None sentInitialSegment = False truncaterClosed = None closed = False length = None def __init__(self, stream, point): self.stream = stream self.deferred = Deferred() if stream.length is not None: self.length = stream.length - point def read(self): if not self.sentInitialSegment: self.sentInitialSegment = True if self.truncaterClosed is not None: readAndDiscard(self.truncaterClosed) self.truncaterClosed = None return self.deferred return self.stream.read() def split(self, point): return fallbackSplit(self, point) def close(self): self.closed = True if self.truncaterClosed is not None: # have first half close itself self.truncaterClosed.postTruncater = None self.truncaterClosed.close() elif self.sentInitialSegment: # first half already finished up self.stream.close() self.deferred = None # Callbacks from TruncaterStream def sendInitialSegment(self, data): if self.closed: # First half finished, we don't want data. self.stream.close() self.stream = None if self.deferred is not None: if isinstance(data, Deferred): data.chainDeferred(self.deferred) else: self.deferred.callback(data) def notifyClosed(self, truncater): if self.closed: # we are closed, have first half really close truncater.postTruncater = None truncater.close() elif self.sentInitialSegment: # We are trying to read, read up first half readAndDiscard(truncater) else: # Idle, store closed info. 
self.truncaterClosed = truncater ######################################## #### ProducerStream/StreamProducer #### ######################################## class ProducerStream(object): """Turns producers into a IByteStream. Thus, implements IConsumer and IByteStream.""" implements(IByteStream, ti_interfaces.IConsumer) length = None closed = False failed = False producer = None producerPaused = False deferred = None bufferSize = 5 def __init__(self, length=None): self.buffer = [] self.length = length # IByteStream implementation def read(self): if self.buffer: return self.buffer.pop(0) elif self.closed: self.length = 0 if self.failed: f = self.failure del self.failure return defer.fail(f) return None else: deferred = self.deferred = Deferred() if self.producer is not None and (not self.streamingProducer or self.producerPaused): self.producerPaused = False self.producer.resumeProducing() return deferred def split(self, point): return fallbackSplit(self, point) def close(self): """Called by reader of stream when it is done reading.""" self.buffer=[] self.closed = True if self.producer is not None: self.producer.stopProducing() self.producer = None self.deferred = None # IConsumer implementation def write(self, data): if self.closed: return if self.deferred: deferred = self.deferred self.deferred = None deferred.callback(data) else: self.buffer.append(data) if(self.producer is not None and self.streamingProducer and len(self.buffer) > self.bufferSize): self.producer.pauseProducing() self.producerPaused = True def finish(self, failure=None): """Called by producer when it is done. If the optional failure argument is passed a Failure instance, the stream will return it as errback on next Deferred. """ self.closed = True if not self.buffer: self.length = 0 if self.deferred is not None: deferred = self.deferred self.deferred = None if failure is not None: self.failed = True deferred.errback(failure) else: deferred.callback(None) else: if failure is not None: self.failed = True self.failure = failure def registerProducer(self, producer, streaming): if self.producer is not None: raise RuntimeError("Cannot register producer %s, because producer %s was never unregistered." % (producer, self.producer)) if self.closed: producer.stopProducing() else: self.producer = producer self.streamingProducer = streaming if not streaming: producer.resumeProducing() def unregisterProducer(self): self.producer = None class StreamProducer(object): """A push producer which gets its data by reading a stream.""" implements(ti_interfaces.IPushProducer) deferred = None finishedCallback = None paused = False consumer = None def __init__(self, stream, enforceStr=True): self.stream = stream self.enforceStr = enforceStr def beginProducing(self, consumer): if self.stream is None: return defer.succeed(None) self.consumer = consumer finishedCallback = self.finishedCallback = Deferred() self.consumer.registerProducer(self, True) self.resumeProducing() return finishedCallback def resumeProducing(self): self.paused = False if self.deferred is not None: return try: data = self.stream.read() except: self.stopProducing(Failure()) return if isinstance(data, Deferred): self.deferred = data.addCallbacks(self._doWrite, self.stopProducing) else: self._doWrite(data) def _doWrite(self, data): if self.consumer is None: return if data is None: # The end. 
if self.consumer is not None: self.consumer.unregisterProducer() if self.finishedCallback is not None: self.finishedCallback.callback(None) self.finishedCallback = self.deferred = self.consumer = self.stream = None return self.deferred = None if self.enforceStr: # XXX: sucks that we have to do this. make transport.write(buffer) work! data = str(buffer(data)) self.consumer.write(data) if not self.paused: self.resumeProducing() def pauseProducing(self): self.paused = True def stopProducing(self, failure=ti_error.ConnectionLost()): if self.consumer is not None: self.consumer.unregisterProducer() if self.finishedCallback is not None: if failure is not None: self.finishedCallback.errback(failure) else: self.finishedCallback.callback(None) self.finishedCallback = None self.paused = True if self.stream is not None: self.stream.close() self.finishedCallback = self.deferred = self.consumer = self.stream = None ############################## #### ProcessStreamer #### ############################## class _ProcessStreamerProtocol(protocol.ProcessProtocol): def __init__(self, inputStream, outStream, errStream): self.inputStream = inputStream self.outStream = outStream self.errStream = errStream self.resultDeferred = defer.Deferred() def connectionMade(self): p = StreamProducer(self.inputStream) # if the process stopped reading from the input stream, # this is not an error condition, so it oughtn't result # in a ConnectionLost() from the input stream: p.stopProducing = lambda err=None: StreamProducer.stopProducing(p, err) d = p.beginProducing(self.transport) d.addCallbacks(lambda _: self.transport.closeStdin(), self._inputError) def _inputError(self, f): log.msg("Error in input stream for %r" % self.transport) log.err(f) self.transport.closeStdin() def outReceived(self, data): self.outStream.write(data) def errReceived(self, data): self.errStream.write(data) def outConnectionLost(self): self.outStream.finish() def errConnectionLost(self): self.errStream.finish() def processEnded(self, reason): self.resultDeferred.errback(reason) del self.resultDeferred class ProcessStreamer(object): """Runs a process hooked up to streams. Requires an input stream, has attributes 'outStream' and 'errStream' for stdout and stderr. outStream and errStream are public attributes providing streams for stdout and stderr of the process. """ def __init__(self, inputStream, program, args, env={}): self.outStream = ProducerStream() self.errStream = ProducerStream() self._protocol = _ProcessStreamerProtocol(IByteStream(inputStream), self.outStream, self.errStream) self._program = program self._args = args self._env = env def run(self): """Run the process. Returns Deferred which will eventually have errback for non-clean (exit code > 0) exit, with ProcessTerminated, or callback with None on exit code 0. """ # XXX what happens if spawn fails? 
reactor.spawnProcess(self._protocol, self._program, self._args, env=self._env) del self._env return self._protocol.resultDeferred.addErrback(lambda _: _.trap(ti_error.ProcessDone)) ############################## #### generatorToStream #### ############################## class _StreamIterator(object): done=False def __iter__(self): return self def next(self): if self.done: raise StopIteration return self.value wait=object() class _IteratorStream(object): length = None def __init__(self, fun, stream, args, kwargs): self._stream=stream self._streamIterator = _StreamIterator() self._gen = fun(self._streamIterator, *args, **kwargs) def read(self): try: val = self._gen.next() except StopIteration: return None else: if val is _StreamIterator.wait: newdata = self._stream.read() if isinstance(newdata, defer.Deferred): return newdata.addCallback(self._gotRead) else: return self._gotRead(newdata) return val def _gotRead(self, data): if data is None: self._streamIterator.done=True else: self._streamIterator.value=data return self.read() def close(self): self._stream.close() del self._gen, self._stream, self._streamIterator def split(self): return fallbackSplit(self) def generatorToStream(fun): """Converts a generator function into a stream. The function should take an iterator as its first argument, which will be converted *from* a stream by this wrapper, and yield items which are turned *into* the results from the stream's 'read' call. One important point: before every call to input.next(), you *MUST* do a "yield input.wait" first. Yielding this magic value takes care of ensuring that the input is not a deferred before you see it. >>> from twisted.web2 import stream >>> from string import maketrans >>> alphabet = 'abcdefghijklmnopqrstuvwxyz' >>> >>> def encrypt(input, key): ... code = alphabet[key:] + alphabet[:key] ... translator = maketrans(alphabet+alphabet.upper(), code+code.upper()) ... yield input.wait ... for s in input: ... yield str(s).translate(translator) ... yield input.wait ... 
>>> encrypt = stream.generatorToStream(encrypt) >>> >>> plaintextStream = stream.MemoryStream('SampleSampleSample') >>> encryptedStream = encrypt(plaintextStream, 13) >>> encryptedStream.read() 'FnzcyrFnzcyrFnzcyr' >>> >>> plaintextStream = stream.MemoryStream('SampleSampleSample') >>> encryptedStream = encrypt(plaintextStream, 13) >>> evenMoreEncryptedStream = encrypt(encryptedStream, 13) >>> evenMoreEncryptedStream.read() 'SampleSampleSample' """ def generatorToStream_inner(stream, *args, **kwargs): return _IteratorStream(fun, stream, args, kwargs) return generatorToStream_inner ############################## #### BufferedStream #### ############################## class BufferedStream(object): """A stream which buffers its data to provide operations like readline and readExactly.""" data = "" def __init__(self, stream): self.stream = stream def _readUntil(self, f): """Internal helper function which repeatedly calls f each time after more data has been received, until it returns non-None.""" while True: r = f() if r is not None: yield r; return newdata = self.stream.read() if isinstance(newdata, defer.Deferred): newdata = defer.waitForDeferred(newdata) yield newdata; newdata = newdata.getResult() if newdata is None: # End Of File newdata = self.data self.data = '' yield newdata; return self.data += str(newdata) _readUntil = defer.deferredGenerator(_readUntil) def readExactly(self, size=None): """Read exactly size bytes of data, or, if size is None, read the entire stream into a string.""" if size is not None and size < 0: raise ValueError("readExactly: size cannot be negative: %s", size) def gotdata(): data = self.data if size is not None and len(data) >= size: pre,post = data[:size], data[size:] self.data = post return pre return self._readUntil(gotdata) def readline(self, delimiter='\r\n', size=None): """ Read a line of data from the string, bounded by delimiter. The delimiter is included in the return value. If size is specified, read and return at most that many bytes, even if the delimiter has not yet been reached. If the size limit falls within a delimiter, the rest of the delimiter, and the next line will be returned together. 
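 An illustrative sketch (C{someStream} and C{handleLine} are hypothetical; readline always returns a Deferred)::

        b = BufferedStream(someStream)
        b.readline().addCallback(handleLine)  # fires with the first line, delimiter included

        Bytes read past the delimiter remain buffered for subsequent readline, readExactly or read calls.
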
""" if size is not None and size < 0: raise ValueError("readline: size cannot be negative: %s" % (size, )) def gotdata(): data = self.data if size is not None: splitpoint = data.find(delimiter, 0, size) if splitpoint == -1: if len(data) >= size: splitpoint = size else: splitpoint += len(delimiter) else: splitpoint = data.find(delimiter) if splitpoint != -1: splitpoint += len(delimiter) if splitpoint != -1: pre = data[:splitpoint] self.data = data[splitpoint:] return pre return self._readUntil(gotdata) def pushback(self, pushed): """Push data back into the buffer.""" self.data = pushed + self.data def read(self): data = self.data if data: self.data = "" return data return self.stream.read() def _len(self): l = self.stream.length if l is None: return None return l + len(self.data) length = property(_len) def split(self, offset): off = offset - len(self.data) pre, post = self.stream.split(max(0, off)) pre = BufferedStream(pre) post = BufferedStream(post) if off < 0: pre.data = self.data[:-off] post.data = self.data[-off:] else: pre.data = self.data return pre, post def substream(stream, start, end): if start > end: raise ValueError("start position must be less than end position %r" % ((start, end),)) stream = stream.split(start)[1] return stream.split(end - start)[0] __all__ = ['IStream', 'IByteStream', 'FileStream', 'MemoryStream', 'CompoundStream', 'readAndDiscard', 'fallbackSplit', 'ProducerStream', 'StreamProducer', 'BufferedStream', 'readStream', 'ProcessStreamer', 'readIntoFile', 'generatorToStream'] TwistedWeb2-8.1.0/twisted/web2/http_headers.py0000644000175000017500000014172110767246617017731 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_http_headers -*- from __future__ import generators import types, time from calendar import timegm import base64 import re def dashCapitalize(s): ''' Capitalize a string, making sure to treat - as a word seperator ''' return '-'.join([ x.capitalize() for x in s.split('-')]) # datetime parsing and formatting weekdayname = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun'] weekdayname_lower = [name.lower() for name in weekdayname] monthname = [None, 'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'] monthname_lower = [name and name.lower() for name in monthname] # HTTP Header parsing API header_case_mapping = {} def casemappingify(d): global header_case_mapping newd = dict([(key.lower(),key) for key in d.keys()]) header_case_mapping.update(newd) def lowerify(d): return dict([(key.lower(),value) for key,value in d.items()]) class HeaderHandler(object): """HeaderHandler manages header generating and parsing functions. """ HTTPParsers = {} HTTPGenerators = {} def __init__(self, parsers=None, generators=None): """ @param parsers: A map of header names to parsing functions. @type parsers: L{dict} @param generators: A map of header names to generating functions. @type generators: L{dict} """ if parsers: self.HTTPParsers.update(parsers) if generators: self.HTTPGenerators.update(generators) def parse(self, name, header): """ Parse the given header based on its given name. @param name: The header name to parse. @type name: C{str} @param header: A list of unparsed headers. @type header: C{list} of C{str} @return: The return value is the parsed header representation, it is dependent on the header. See the HTTP Headers document. """ parser = self.HTTPParsers.get(name, None) if parser is None: raise ValueError("No header parser for header '%s', either add one or use getHeaderRaw." 
% (name,)) try: for p in parser: # print "Parsing %s: %s(%s)" % (name, repr(p), repr(h)) header = p(header) # if isinstance(h, types.GeneratorType): # h=list(h) except ValueError,v: # print v header=None return header def generate(self, name, header): """ Generate the given header based on its given name. @param name: The header name to generate. @type name: C{str} @param header: A parsed header, such as the output of L{HeaderHandler}.parse. @return: C{list} of C{str} each representing a generated HTTP header. """ generator = self.HTTPGenerators.get(name, None) if generator is None: # print self.generators raise ValueError("No header generator for header '%s', either add one or use setHeaderRaw." % (name,)) for g in generator: header = g(header) #self._raw_headers[name] = h return header def updateParsers(self, parsers): """Update en masse the parser maps. @param parsers: Map of header names to parser chains. @type parsers: C{dict} """ casemappingify(parsers) self.HTTPParsers.update(lowerify(parsers)) def addParser(self, name, value): """Add an individual parser chain for the given header. @param name: Name of the header to add @type name: C{str} @param value: The parser chain @type value: C{str} """ self.updateParsers({name: value}) def updateGenerators(self, generators): """Update en masse the generator maps. @param parsers: Map of header names to generator chains. @type parsers: C{dict} """ casemappingify(generators) self.HTTPGenerators.update(lowerify(generators)) def addGenerators(self, name, value): """Add an individual generator chain for the given header. @param name: Name of the header to add @type name: C{str} @param value: The generator chain @type value: C{str} """ self.updateGenerators({name: value}) def update(self, parsers, generators): """Conveniently update parsers and generators all at once. """ self.updateParsers(parsers) self.updateGenerators(generators) DefaultHTTPHandler = HeaderHandler() ## HTTP DateTime parser def parseDateTime(dateString): """Convert an HTTP date string (one of three formats) to seconds since epoch.""" parts = dateString.split() if not parts[0][0:3].lower() in weekdayname_lower: # Weekday is stupid. Might have been omitted. try: return parseDateTime("Sun, "+dateString) except ValueError: # Guess not. pass partlen = len(parts) if (partlen == 5 or partlen == 6) and parts[1].isdigit(): # 1st date format: Sun, 06 Nov 1994 08:49:37 GMT # (Note: "GMT" is literal, not a variable timezone) # (also handles without "GMT") # This is the normal format day = parts[1] month = parts[2] year = parts[3] time = parts[4] elif (partlen == 3 or partlen == 4) and parts[1].find('-') != -1: # 2nd date format: Sunday, 06-Nov-94 08:49:37 GMT # (Note: "GMT" is literal, not a variable timezone) # (also handles without without "GMT") # Two digit year, yucko. day, month, year = parts[1].split('-') time = parts[2] year=int(year) if year < 69: year = year + 2000 elif year < 100: year = year + 1900 elif len(parts) == 5: # 3rd date format: Sun Nov 6 08:49:37 1994 # ANSI C asctime() format. 
day = parts[2] month = parts[1] year = parts[4] time = parts[3] else: raise ValueError("Unknown datetime format %r" % dateString) day = int(day) month = int(monthname_lower.index(month.lower())) year = int(year) hour, min, sec = map(int, time.split(':')) return int(timegm((year, month, day, hour, min, sec))) ##### HTTP tokenizer class Token(str): __slots__=[] tokens = {} def __new__(self, char): token = Token.tokens.get(char) if token is None: Token.tokens[char] = token = str.__new__(self, char) return token def __repr__(self): return "Token(%s)" % str.__repr__(self) http_tokens = " \t\"()<>@,;:\\/[]?={}" http_ctls = "\x00\x01\x02\x03\x04\x05\x06\x07\x08\t\n\x0b\x0c\r\x0e\x0f\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f\x7f" def tokenize(header, foldCase=True): """Tokenize a string according to normal HTTP header parsing rules. In particular: - Whitespace is irrelevant and eaten next to special separator tokens. Its existance (but not amount) is important between character strings. - Quoted string support including embedded backslashes. - Case is insignificant (and thus lowercased), except in quoted strings. (unless foldCase=False) - Multiple headers are concatenated with ',' NOTE: not all headers can be parsed with this function. Takes a raw header value (list of strings), and Returns a generator of strings and Token class instances. """ tokens=http_tokens ctls=http_ctls string = ",".join(header) list = [] start = 0 cur = 0 quoted = False qpair = False inSpaces = -1 qstring = None for x in string: if quoted: if qpair: qpair = False qstring = qstring+string[start:cur-1]+x start = cur+1 elif x == '\\': qpair = True elif x == '"': quoted = False yield qstring+string[start:cur] qstring=None start = cur+1 elif x in tokens: if start != cur: if foldCase: yield string[start:cur].lower() else: yield string[start:cur] start = cur+1 if x == '"': quoted = True qstring = "" inSpaces = False elif x in " \t": if inSpaces is False: inSpaces = True else: inSpaces = -1 yield Token(x) elif x in ctls: raise ValueError("Invalid control character: %d in header" % ord(x)) else: if inSpaces is True: yield Token(' ') inSpaces = False inSpaces = False cur = cur+1 if qpair: raise ValueError, "Missing character after '\\'" if quoted: raise ValueError, "Missing end quote" if start != cur: if foldCase: yield string[start:cur].lower() else: yield string[start:cur] def split(seq, delim): """The same as str.split but works on arbitrary sequences. Too bad it's not builtin to python!""" cur = [] for item in seq: if item == delim: yield cur cur = [] else: cur.append(item) yield cur # def find(seq, *args): # """The same as seq.index but returns -1 if not found, instead # Too bad it's not builtin to python!""" # try: # return seq.index(value, *args) # except ValueError: # return -1 def filterTokens(seq): """Filter out instances of Token, leaving only a list of strings. Used instead of a more specific parsing method (e.g. splitting on commas) when only strings are expected, so as to be a little lenient. Apache does it this way and has some comments about broken clients which forget commas (?), so I'm doing it the same way. It shouldn't hurt anything, in any case. """ l=[] for x in seq: if not isinstance(x, Token): l.append(x) return l ##### parser utilities: def checkSingleToken(tokens): if len(tokens) != 1: raise ValueError, "Expected single token, not %s." 
% (tokens,) return tokens[0] def parseKeyValue(val): if len(val) == 1: return val[0],None elif len(val) == 3 and val[1] == Token('='): return val[0],val[2] raise ValueError, "Expected key or key=value, but got %s." % (val,) def parseArgs(field): args=split(field, Token(';')) val = args.next() args = [parseKeyValue(arg) for arg in args] return val,args def listParser(fun): """Return a function which applies 'fun' to every element in the comma-separated list""" def listParserHelper(tokens): fields = split(tokens, Token(',')) for field in fields: if len(field) != 0: yield fun(field) return listParserHelper def last(seq): """Return seq[-1]""" return seq[-1] ##### Generation utilities def quoteString(s): return '"%s"' % s.replace('\\', '\\\\').replace('"', '\\"') def listGenerator(fun): """Return a function which applies 'fun' to every element in the given list, then joins the result with generateList""" def listGeneratorHelper(l): return generateList([fun(e) for e in l]) return listGeneratorHelper def generateList(seq): return ", ".join(seq) def singleHeader(item): return [item] def generateKeyValues(kvs): l = [] # print kvs for k,v in kvs: if v is None: l.append('%s' % k) else: l.append('%s=%s' % (k,v)) return ";".join(l) class MimeType(object): def fromString(klass, mimeTypeString): """Generate a MimeType object from the given string. @param mimeTypeString: The mimetype to parse @return: L{MimeType} """ return DefaultHTTPHandler.parse('content-type', [mimeTypeString]) fromString = classmethod(fromString) def __init__(self, mediaType, mediaSubtype, params={}, **kwargs): """ @type mediaType: C{str} @type mediaSubtype: C{str} @type params: C{dict} """ self.mediaType = mediaType self.mediaSubtype = mediaSubtype self.params = dict(params) if kwargs: self.params.update(kwargs) def __eq__(self, other): if not isinstance(other, MimeType): return NotImplemented return (self.mediaType == other.mediaType and self.mediaSubtype == other.mediaSubtype and self.params == other.params) def __ne__(self, other): return not self.__eq__(other) def __repr__(self): return "MimeType(%r, %r, %r)" % (self.mediaType, self.mediaSubtype, self.params) def __hash__(self): return hash(self.mediaType)^hash(self.mediaSubtype)^hash(tuple(self.params.iteritems())) ##### Specific header parsers. def parseAccept(field): type,args = parseArgs(field) if len(type) != 3 or type[1] != Token('/'): raise ValueError, "MIME Type "+str(type)+" invalid." # okay, this spec is screwy. A 'q' parameter is used as the separator # between MIME parameters and (as yet undefined) additional HTTP # parameters. num = 0 for arg in args: if arg[0] == 'q': mimeparams=tuple(args[0:num]) params=args[num:] break num = num + 1 else: mimeparams=tuple(args) params=[] # Default values for parameters: qval = 1.0 # Parse accept parameters: for param in params: if param[0] =='q': qval = float(param[1]) else: # Warn? ignored parameter. pass ret = MimeType(type[0],type[2],mimeparams),qval return ret def parseAcceptQvalue(field): type,args=parseArgs(field) type = checkSingleToken(type) qvalue = 1.0 # Default qvalue is 1 for arg in args: if arg[0] == 'q': qvalue = float(arg[1]) return type,qvalue def addDefaultCharset(charsets): if charsets.get('*') is None and charsets.get('iso-8859-1') is None: charsets['iso-8859-1'] = 1.0 return charsets def addDefaultEncoding(encodings): if encodings.get('*') is None and encodings.get('identity') is None: # RFC doesn't specify a default value for identity, only that it # "is acceptable" if not mentioned. 
Thus, give it a very low qvalue. encodings['identity'] = .0001 return encodings def parseContentType(header): # Case folding is disabled for this header, because of use of # Content-Type: multipart/form-data; boundary=CaSeFuLsTuFf # So, we need to explicitly .lower() the type/subtype and arg keys. type,args = parseArgs(header) if len(type) != 3 or type[1] != Token('/'): raise ValueError, "MIME Type "+str(type)+" invalid." args = [(kv[0].lower(), kv[1]) for kv in args] return MimeType(type[0].lower(), type[2].lower(), tuple(args)) def parseContentMD5(header): try: return base64.decodestring(header) except Exception,e: raise ValueError(e) def parseContentRange(header): """Parse a content-range header into (kind, start, end, realLength). realLength might be None if real length is not known ('*'). start and end might be None if start,end unspecified (for response code 416) """ kind, other = header.strip().split() if kind.lower() != "bytes": raise ValueError("a range of type %r is not supported") startend, realLength = other.split("/") if startend.strip() == '*': start,end=None,None else: start, end = map(int, startend.split("-")) if realLength == "*": realLength = None else: realLength = int(realLength) return (kind, start, end, realLength) def parseExpect(field): type,args=parseArgs(field) type=parseKeyValue(type) return (type[0], (lambda *args:args)(type[1], *args)) def parseExpires(header): # """HTTP/1.1 clients and caches MUST treat other invalid date formats, # especially including the value 0, as in the past (i.e., "already expired").""" try: return parseDateTime(header) except ValueError: return 0 def parseIfModifiedSince(header): # Ancient versions of netscape and *current* versions of MSIE send # If-Modified-Since: Thu, 05 Aug 2004 12:57:27 GMT; length=123 # which is blantantly RFC-violating and not documented anywhere # except bug-trackers for web frameworks. # So, we'll just strip off everything after a ';'. return parseDateTime(header.split(';', 1)[0]) def parseIfRange(headers): try: return ETag.parse(tokenize(headers)) except ValueError: return parseDateTime(last(headers)) def parseRange(range): range = list(range) if len(range) < 3 or range[1] != Token('='): raise ValueError("Invalid range header format: %s" %(range,)) type=range[0] if type != 'bytes': raise ValueError("Unknown range unit: %s." 
% (type,)) rangeset=split(range[2:], Token(',')) ranges = [] for byterangespec in rangeset: if len(byterangespec) != 1: raise ValueError("Invalid range header format: %s" % (range,)) start,end=byterangespec[0].split('-') if not start and not end: raise ValueError("Invalid range header format: %s" % (range,)) if start: start = int(start) else: start = None if end: end = int(end) else: end = None if start and end and start > end: raise ValueError("Invalid range header, start > end: %s" % (range,)) ranges.append((start,end)) return type,ranges def parseRetryAfter(header): try: # delta seconds return time.time() + int(header) except ValueError: # or datetime return parseDateTime(header) # WWW-Authenticate and Authorization def parseWWWAuthenticate(tokenized): headers = [] tokenList = list(tokenized) while tokenList: scheme = tokenList.pop(0) challenge = {} last = None kvChallenge = False while tokenList: token = tokenList.pop(0) if token == Token('='): kvChallenge = True challenge[last] = tokenList.pop(0) last = None elif token == Token(','): if kvChallenge: if len(tokenList) > 1 and tokenList[1] != Token('='): break else: break else: last = token if last and scheme and not challenge and not kvChallenge: challenge = last last = None headers.append((scheme, challenge)) if last and last not in (Token('='), Token(',')): if headers[-1] == (scheme, challenge): scheme = last challenge = {} headers.append((scheme, challenge)) return headers def parseAuthorization(header): scheme, rest = header.split(' ', 1) # this header isn't tokenized because it may eat characters # in the unquoted base64 encoded credentials return scheme.lower(), rest #### Header generators def generateAccept(accept): mimeType,q = accept out="%s/%s"%(mimeType.mediaType, mimeType.mediaSubtype) if mimeType.params: out+=';'+generateKeyValues(mimeType.params.iteritems()) if q != 1.0: out+=(';q=%.3f' % (q,)).rstrip('0').rstrip('.') return out def removeDefaultEncoding(seq): for item in seq: if item[0] != 'identity' or item[1] != .0001: yield item def generateAcceptQvalue(keyvalue): if keyvalue[1] == 1.0: return "%s" % keyvalue[0:1] else: return ("%s;q=%.3f" % keyvalue).rstrip('0').rstrip('.') def parseCacheControl(kv): k, v = parseKeyValue(kv) if k == 'max-age' or k == 'min-fresh' or k == 's-maxage': # Required integer argument if v is None: v = 0 else: v = int(v) elif k == 'max-stale': # Optional integer argument if v is not None: v = int(v) elif k == 'private' or k == 'no-cache': # Optional list argument if v is not None: v = [field.strip().lower() for field in v.split(',')] return k, v def generateCacheControl((k, v)): if v is None: return str(k) else: if k == 'no-cache' or k == 'private': # quoted list of values v = quoteString(generateList( [header_case_mapping.get(name) or dashCapitalize(name) for name in v])) return '%s=%s' % (k,v) def generateContentRange(tup): """tup is (type, start, end, len) len can be None. 
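 For instance (an illustrative sketch)::

        generateContentRange(('bytes', 0, 499, 1234))       # -> 'bytes 0-499/1234'
        generateContentRange(('bytes', None, None, None))   # -> 'bytes */*'
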
""" type, start, end, len = tup if len == None: len = '*' else: len = int(len) if start == None and end == None: startend = '*' else: startend = '%d-%d' % (start, end) return '%s %s/%s' % (type, startend, len) def generateDateTime(secSinceEpoch): """Convert seconds since epoch to HTTP datetime string.""" year, month, day, hh, mm, ss, wd, y, z = time.gmtime(secSinceEpoch) s = "%s, %02d %3s %4d %02d:%02d:%02d GMT" % ( weekdayname[wd], day, monthname[month], year, hh, mm, ss) return s def generateExpect(item): if item[1][0] is None: out = '%s' % (item[0],) else: out = '%s=%s' % (item[0], item[1][0]) if len(item[1]) > 1: out += ';'+generateKeyValues(item[1][1:]) return out def generateRange(range): def noneOr(s): if s is None: return '' return s type,ranges=range if type != 'bytes': raise ValueError("Unknown range unit: "+type+".") return (type+'='+ ','.join(['%s-%s' % (noneOr(startend[0]), noneOr(startend[1])) for startend in ranges])) def generateRetryAfter(when): # always generate delta seconds format return str(int(when - time.time())) def generateContentType(mimeType): out="%s/%s"%(mimeType.mediaType, mimeType.mediaSubtype) if mimeType.params: out+=';'+generateKeyValues(mimeType.params.iteritems()) return out def generateIfRange(dateOrETag): if isinstance(dateOrETag, ETag): return dateOrETag.generate() else: return generateDateTime(dateOrETag) # WWW-Authenticate and Authorization def generateWWWAuthenticate(headers): _generated = [] for seq in headers: scheme, challenge = seq[0], seq[1] # If we're going to parse out to something other than a dict # we need to be able to generate from something other than a dict try: l = [] for k,v in dict(challenge).iteritems(): l.append("%s=%s" % (k, quoteString(v))) _generated.append("%s %s" % (scheme, ", ".join(l))) except ValueError: _generated.append("%s %s" % (scheme, challenge)) return _generated def generateAuthorization(seq): return [' '.join(seq)] #### class ETag(object): def __init__(self, tag, weak=False): self.tag = str(tag) self.weak = weak def match(self, other, strongCompare): # Sec 13.3. # The strong comparison function: in order to be considered equal, both # validators MUST be identical in every way, and both MUST NOT be weak. # # The weak comparison function: in order to be considered equal, both # validators MUST be identical in every way, but either or both of # them MAY be tagged as "weak" without affecting the result. if not isinstance(other, ETag) or other.tag != self.tag: return False if strongCompare and (other.weak or self.weak): return False return True def __eq__(self, other): return isinstance(other, ETag) and other.tag == self.tag and other.weak == self.weak def __ne__(self, other): return not self.__eq__(other) def __repr__(self): return "Etag(%r, weak=%r)" % (self.tag, self.weak) def parse(tokens): tokens=tuple(tokens) if len(tokens) == 1 and not isinstance(tokens[0], Token): return ETag(tokens[0]) if(len(tokens) == 3 and tokens[0] == "w" and tokens[1] == Token('/')): return ETag(tokens[2], weak=True) raise ValueError("Invalid ETag.") parse=staticmethod(parse) def generate(self): if self.weak: return 'W/'+quoteString(self.tag) else: return quoteString(self.tag) def parseStarOrETag(tokens): tokens=tuple(tokens) if tokens == ('*',): return '*' else: return ETag.parse(tokens) def generateStarOrETag(etag): if etag=='*': return etag else: return etag.generate() #### Cookies. Blech! 
class Cookie(object): # __slots__ = ['name', 'value', 'path', 'domain', 'ports', 'expires', 'discard', 'secure', 'comment', 'commenturl', 'version'] def __init__(self, name, value, path=None, domain=None, ports=None, expires=None, discard=False, secure=False, comment=None, commenturl=None, version=0): self.name=name self.value=value self.path=path self.domain=domain self.ports=ports self.expires=expires self.discard=discard self.secure=secure self.comment=comment self.commenturl=commenturl self.version=version def __repr__(self): s="Cookie(%r=%r" % (self.name, self.value) if self.path is not None: s+=", path=%r" % (self.path,) if self.domain is not None: s+=", domain=%r" % (self.domain,) if self.ports is not None: s+=", ports=%r" % (self.ports,) if self.expires is not None: s+=", expires=%r" % (self.expires,) if self.secure is not False: s+=", secure=%r" % (self.secure,) if self.comment is not None: s+=", comment=%r" % (self.comment,) if self.commenturl is not None: s+=", commenturl=%r" % (self.commenturl,) if self.version != 0: s+=", version=%r" % (self.version,) s+=")" return s def __eq__(self, other): return (isinstance(other, Cookie) and other.path == self.path and other.domain == self.domain and other.ports == self.ports and other.expires == self.expires and other.secure == self.secure and other.comment == self.comment and other.commenturl == self.commenturl and other.version == self.version) def __ne__(self, other): return not self.__eq__(other) def parseCookie(headers): """Bleargh, the cookie spec sucks. This surely needs interoperability testing. There are two specs that are supported: Version 0) http://wp.netscape.com/newsref/std/cookie_spec.html Version 1) http://www.faqs.org/rfcs/rfc2965.html """ cookies = [] # There can't really be multiple cookie headers according to RFC, because # if multiple headers are allowed, they must be joinable with ",". # Neither new RFC2965 cookies nor old netscape cookies are. header = ';'.join(headers) if header[0:8].lower() == "$version": # RFC2965 cookie h=tokenize([header], foldCase=False) r_cookies = split(h, Token(',')) for r_cookie in r_cookies: last_cookie = None rr_cookies = split(r_cookie, Token(';')) for cookie in rr_cookies: nameval = tuple(split(cookie, Token('='))) if len(nameval) == 2: (name,), (value,) = nameval else: (name,), = nameval value = None name=name.lower() if name == '$version': continue if name[0] == '$': if last_cookie is not None: if name == '$path': last_cookie.path=value elif name == '$domain': last_cookie.domain=value elif name == '$port': if value is None: last_cookie.ports = () else: last_cookie.ports=tuple([int(s) for s in value.split(',')]) else: last_cookie = Cookie(name, value, version=1) cookies.append(last_cookie) else: # Oldstyle cookies don't do quoted strings or anything sensible. # All characters are valid for names except ';' and '=', and all # characters are valid for values except ';'. Spaces are stripped, # however. r_cookies = header.split(';') for r_cookie in r_cookies: name,value = r_cookie.split('=', 1) name=name.strip(' \t') value=value.strip(' \t') cookies.append(Cookie(name, value)) return cookies cookie_validname = "[^"+re.escape(http_tokens+http_ctls)+"]*$" cookie_validname_re = re.compile(cookie_validname) cookie_validvalue = cookie_validname+'|"([^"]|\\\\")*"$' cookie_validvalue_re = re.compile(cookie_validvalue) def generateCookie(cookies): # There's a fundamental problem with the two cookie specifications. 
# They both use the "Cookie" header, and the RFC Cookie header only allows # one version to be specified. Thus, when you have a collection of V0 and # V1 cookies, you have to either send them all as V0 or send them all as # V1. # I choose to send them all as V1. # You might think converting a V0 cookie to a V1 cookie would be lossless, # but you'd be wrong. If you do the conversion, and a V0 parser tries to # read the cookie, it will see a modified form of the cookie, in cases # where quotes must be added to conform to proper V1 syntax. # (as a real example: "Cookie: cartcontents=oid:94680,qty:1,auto:0,esp:y") # However, that is what we will do, anyways. It has a high probability of # breaking applications that only handle oldstyle cookies, where some other # application set a newstyle cookie that is applicable over for site # (or host), AND where the oldstyle cookie uses a value which is invalid # syntax in a newstyle cookie. # Also, the cookie name *cannot* be quoted in V1, so some cookies just # cannot be converted at all. (e.g. "Cookie: phpAds_capAd[32]=2"). These # are just dicarded during conversion. # As this is an unsolvable problem, I will pretend I can just say # OH WELL, don't do that, or else upgrade your old applications to have # newstyle cookie parsers. # I will note offhandedly that there are *many* sites which send V0 cookies # that are not valid V1 cookie syntax. About 20% for my cookies file. # However, they do not generally mix them with V1 cookies, so this isn't # an issue, at least right now. I have not tested to see how many of those # webapps support RFC2965 V1 cookies. I suspect not many. max_version = max([cookie.version for cookie in cookies]) if max_version == 0: # no quoting or anything. return ';'.join(["%s=%s" % (cookie.name, cookie.value) for cookie in cookies]) else: str_cookies = ['$Version="1"'] for cookie in cookies: if cookie.version == 0: # Version 0 cookie: we make sure the name and value are valid # V1 syntax. # If they are, we use them as is. This means in *most* cases, # the cookie will look literally the same on output as it did # on input. # If it isn't a valid name, ignore the cookie. # If it isn't a valid value, quote it and hope for the best on # the other side. if cookie_validname_re.match(cookie.name) is None: continue value=cookie.value if cookie_validvalue_re.match(cookie.value) is None: value = quoteString(value) str_cookies.append("%s=%s" % (cookie.name, value)) else: # V1 cookie, nice and easy str_cookies.append("%s=%s" % (cookie.name, quoteString(cookie.value))) if cookie.path: str_cookies.append("$Path=%s" % quoteString(cookie.path)) if cookie.domain: str_cookies.append("$Domain=%s" % quoteString(cookie.domain)) if cookie.ports is not None: if len(cookie.ports) == 0: str_cookies.append("$Port") else: str_cookies.append("$Port=%s" % quoteString(",".join([str(x) for x in cookie.ports]))) return ';'.join(str_cookies) def parseSetCookie(headers): setCookies = [] for header in headers: try: parts = header.split(';') l = [] for part in parts: namevalue = part.split('=',1) if len(namevalue) == 1: name=namevalue[0] value=None else: name,value=namevalue value=value.strip(' \t') name=name.strip(' \t') l.append((name, value)) setCookies.append(makeCookieFromList(l, True)) except ValueError: # If we can't parse one Set-Cookie, ignore it, # but not the rest of Set-Cookies. 
pass return setCookies def parseSetCookie2(toks): outCookies = [] for cookie in [[parseKeyValue(x) for x in split(y, Token(';'))] for y in split(toks, Token(','))]: try: outCookies.append(makeCookieFromList(cookie, False)) except ValueError: # Again, if we can't handle one cookie -- ignore it. pass return outCookies def makeCookieFromList(tup, netscapeFormat): name, value = tup[0] if name is None or value is None: raise ValueError("Cookie has missing name or value") if name.startswith("$"): raise ValueError("Invalid cookie name: %r, starts with '$'." % name) cookie = Cookie(name, value) hadMaxAge = False for name,value in tup[1:]: name = name.lower() if value is None: if name in ("discard", "secure"): # Boolean attrs value = True elif name != "port": # Can be either boolean or explicit continue if name in ("comment", "commenturl", "discard", "domain", "path", "secure"): # simple cases setattr(cookie, name, value) elif name == "expires" and not hadMaxAge: if netscapeFormat and value[0] == '"' and value[-1] == '"': value = value[1:-1] cookie.expires = parseDateTime(value) elif name == "max-age": hadMaxAge = True cookie.expires = int(value) + time.time() elif name == "port": if value is None: cookie.ports = () else: if netscapeFormat and value[0] == '"' and value[-1] == '"': value = value[1:-1] cookie.ports = tuple([int(s) for s in value.split(',')]) elif name == "version": cookie.version = int(value) return cookie def generateSetCookie(cookies): setCookies = [] for cookie in cookies: out = ["%s=%s" % (cookie.name, cookie.value)] if cookie.expires: out.append("expires=%s" % generateDateTime(cookie.expires)) if cookie.path: out.append("path=%s" % cookie.path) if cookie.domain: out.append("domain=%s" % cookie.domain) if cookie.secure: out.append("secure") setCookies.append('; '.join(out)) return setCookies def generateSetCookie2(cookies): setCookies = [] for cookie in cookies: out = ["%s=%s" % (cookie.name, quoteString(cookie.value))] if cookie.comment: out.append("Comment=%s" % quoteString(cookie.comment)) if cookie.commenturl: out.append("CommentURL=%s" % quoteString(cookie.commenturl)) if cookie.discard: out.append("Discard") if cookie.domain: out.append("Domain=%s" % quoteString(cookie.domain)) if cookie.expires: out.append("Max-Age=%s" % (cookie.expires - time.time())) if cookie.path: out.append("Path=%s" % quoteString(cookie.path)) if cookie.ports is not None: if len(cookie.ports) == 0: out.append("Port") else: out.append("Port=%s" % quoteString(",".join([str(x) for x in cookie.ports]))) if cookie.secure: out.append("Secure") out.append('Version="1"') setCookies.append('; '.join(out)) return setCookies def parseDepth(depth): if depth not in ("0", "1", "infinity"): raise ValueError("Invalid depth header value: %s" % (depth,)) return depth def parseOverWrite(overwrite): if overwrite == "F": return False elif overwrite == "T": return True raise ValueError("Invalid overwrite header value: %s" % (overwrite,)) def generateOverWrite(overwrite): if overwrite: return "T" else: return "F" ##### Random stuff that looks useful. # def sortMimeQuality(s): # def sorter(item1, item2): # if item1[0] == '*': # if item2[0] == '*': # return 0 # def sortQuality(s): # def sorter(item1, item2): # if item1[1] < item2[1]: # return -1 # if item1[1] < item2[1]: # return 1 # if item1[0] == item2[0]: # return 0 # def getMimeQuality(mimeType, accepts): # type,args = parseArgs(mimeType) # type=type.split(Token('/')) # if len(type) != 2: # raise ValueError, "MIME Type "+s+" invalid." 
# for accept in accepts: # accept,acceptQual=accept # acceptType=accept[0:1] # acceptArgs=accept[2] # if ((acceptType == type or acceptType == (type[0],'*') or acceptType==('*','*')) and # (args == acceptArgs or len(acceptArgs) == 0)): # return acceptQual # def getQuality(type, accepts): # qual = accepts.get(type) # if qual is not None: # return qual # return accepts.get('*') # Headers object class __RecalcNeeded(object): def __repr__(self): return "" _RecalcNeeded = __RecalcNeeded() class Headers(object): """This class stores the HTTP headers as both a parsed representation and the raw string representation. It converts between the two on demand.""" def __init__(self, headers=None, rawHeaders=None, handler=DefaultHTTPHandler): self._raw_headers = {} self._headers = {} self.handler = handler if headers is not None: for key, value in headers.iteritems(): self.setHeader(key, value) if rawHeaders is not None: for key, value in rawHeaders.iteritems(): self.setRawHeaders(key, value) def _setRawHeaders(self, headers): self._raw_headers = headers self._headers = {} def _toParsed(self, name): r = self._raw_headers.get(name, None) h = self.handler.parse(name, r) if h is not None: self._headers[name] = h return h def _toRaw(self, name): h = self._headers.get(name, None) r = self.handler.generate(name, h) if r is not None: self._raw_headers[name] = r return r def hasHeader(self, name): """Does a header with the given name exist?""" name=name.lower() return self._raw_headers.has_key(name) def getRawHeaders(self, name, default=None): """Returns a list of headers matching the given name as the raw string given.""" name=name.lower() raw_header = self._raw_headers.get(name, default) if raw_header is not _RecalcNeeded: return raw_header return self._toRaw(name) def getHeader(self, name, default=None): """Ret9urns the parsed representation of the given header. The exact form of the return value depends on the header in question. If no parser for the header exists, raise ValueError. If the header doesn't exist, return default (or None if not specified) """ name=name.lower() parsed = self._headers.get(name, default) if parsed is not _RecalcNeeded: return parsed return self._toParsed(name) def setRawHeaders(self, name, value): """Sets the raw representation of the given header. Value should be a list of strings, each being one header of the given name. """ name=name.lower() self._raw_headers[name] = value self._headers[name] = _RecalcNeeded def setHeader(self, name, value): """Sets the parsed representation of the given header. Value should be a list of objects whose exact form depends on the header in question. """ name=name.lower() self._raw_headers[name] = _RecalcNeeded self._headers[name] = value def addRawHeader(self, name, value): """ Add a raw value to a header that may or may not already exist. If it exists, add it as a separate header to output; do not replace anything. 
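 For example (an illustrative sketch)::

        h = Headers()
        h.addRawHeader('set-cookie', 'a=1')
        h.addRawHeader('set-cookie', 'b=2')
        h.getRawHeaders('set-cookie')  # -> ['a=1', 'b=2']
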
""" name=name.lower() raw_header = self._raw_headers.get(name) if raw_header is None: # No header yet raw_header = [] self._raw_headers[name] = raw_header elif raw_header is _RecalcNeeded: raw_header = self._toRaw(name) raw_header.append(value) self._headers[name] = _RecalcNeeded def removeHeader(self, name): """Removes the header named.""" name=name.lower() if self._raw_headers.has_key(name): del self._raw_headers[name] del self._headers[name] def __repr__(self): return ''% (self._raw_headers, self._headers) def canonicalNameCaps(self, name): """Return the name with the canonical capitalization, if known, otherwise, Caps-After-Dashes""" return header_case_mapping.get(name) or dashCapitalize(name) def getAllRawHeaders(self): """Return an iterator of key,value pairs of all headers contained in this object, as strings. The keys are capitalized in canonical capitalization.""" for k,v in self._raw_headers.iteritems(): if v is _RecalcNeeded: v = self._toRaw(k) yield self.canonicalNameCaps(k), v def makeImmutable(self): """Make this header set immutable. All mutating operations will raise an exception.""" self.setHeader = self.setRawHeaders = self.removeHeader = self._mutateRaise def _mutateRaise(self, *args): raise AttributeError("This header object is immutable as the headers have already been sent.") """The following dicts are all mappings of header to list of operations to perform. The first operation should generally be 'tokenize' if the header can be parsed according to the normal tokenization rules. If it cannot, generally the first thing you want to do is take only the last instance of the header (in case it was sent multiple times, which is strictly an error, but we're nice.). """ iteritems = lambda x: x.iteritems() parser_general_headers = { 'Cache-Control':(tokenize, listParser(parseCacheControl), dict), 'Connection':(tokenize,filterTokens), 'Date':(last,parseDateTime), # 'Pragma':tokenize # 'Trailer':tokenize 'Transfer-Encoding':(tokenize,filterTokens), # 'Upgrade':tokenize # 'Via':tokenize,stripComment # 'Warning':tokenize } generator_general_headers = { 'Cache-Control':(iteritems, listGenerator(generateCacheControl), singleHeader), 'Connection':(generateList,singleHeader), 'Date':(generateDateTime,singleHeader), # 'Pragma': # 'Trailer': 'Transfer-Encoding':(generateList,singleHeader), # 'Upgrade': # 'Via': # 'Warning': } parser_request_headers = { 'Accept': (tokenize, listParser(parseAccept), dict), 'Accept-Charset': (tokenize, listParser(parseAcceptQvalue), dict, addDefaultCharset), 'Accept-Encoding':(tokenize, listParser(parseAcceptQvalue), dict, addDefaultEncoding), 'Accept-Language':(tokenize, listParser(parseAcceptQvalue), dict), 'Authorization': (last, parseAuthorization), 'Cookie':(parseCookie,), 'Expect':(tokenize, listParser(parseExpect), dict), 'From':(last,), 'Host':(last,), 'If-Match':(tokenize, listParser(parseStarOrETag), list), 'If-Modified-Since':(last, parseIfModifiedSince), 'If-None-Match':(tokenize, listParser(parseStarOrETag), list), 'If-Range':(parseIfRange,), 'If-Unmodified-Since':(last,parseDateTime), 'Max-Forwards':(last,int), # 'Proxy-Authorization':str, # what is "credentials" 'Range':(tokenize, parseRange), 'Referer':(last,str), # TODO: URI object? 
'TE':(tokenize, listParser(parseAcceptQvalue), dict), 'User-Agent':(last,str), } generator_request_headers = { 'Accept': (iteritems,listGenerator(generateAccept),singleHeader), 'Accept-Charset': (iteritems, listGenerator(generateAcceptQvalue),singleHeader), 'Accept-Encoding': (iteritems, removeDefaultEncoding, listGenerator(generateAcceptQvalue),singleHeader), 'Accept-Language': (iteritems, listGenerator(generateAcceptQvalue),singleHeader), 'Authorization': (generateAuthorization,), # what is "credentials" 'Cookie':(generateCookie,singleHeader), 'Expect':(iteritems, listGenerator(generateExpect), singleHeader), 'From':(str,singleHeader), 'Host':(str,singleHeader), 'If-Match':(listGenerator(generateStarOrETag), singleHeader), 'If-Modified-Since':(generateDateTime,singleHeader), 'If-None-Match':(listGenerator(generateStarOrETag), singleHeader), 'If-Range':(generateIfRange, singleHeader), 'If-Unmodified-Since':(generateDateTime,singleHeader), 'Max-Forwards':(str, singleHeader), # 'Proxy-Authorization':str, # what is "credentials" 'Range':(generateRange,singleHeader), 'Referer':(str,singleHeader), 'TE': (iteritems, listGenerator(generateAcceptQvalue),singleHeader), 'User-Agent':(str,singleHeader), } parser_response_headers = { 'Accept-Ranges':(tokenize, filterTokens), 'Age':(last,int), 'ETag':(tokenize, ETag.parse), 'Location':(last,), # TODO: URI object? # 'Proxy-Authenticate' 'Retry-After':(last, parseRetryAfter), 'Server':(last,), 'Set-Cookie':(parseSetCookie,), 'Set-Cookie2':(tokenize, parseSetCookie2), 'Vary':(tokenize, filterTokens), 'WWW-Authenticate': (lambda h: tokenize(h, foldCase=False), parseWWWAuthenticate,) } generator_response_headers = { 'Accept-Ranges':(generateList, singleHeader), 'Age':(str, singleHeader), 'ETag':(ETag.generate, singleHeader), 'Location':(str, singleHeader), # 'Proxy-Authenticate' 'Retry-After':(generateRetryAfter, singleHeader), 'Server':(str, singleHeader), 'Set-Cookie':(generateSetCookie,), 'Set-Cookie2':(generateSetCookie2,), 'Vary':(generateList, singleHeader), 'WWW-Authenticate':(generateWWWAuthenticate,) } parser_entity_headers = { 'Allow':(lambda str:tokenize(str, foldCase=False), filterTokens), 'Content-Encoding':(tokenize, filterTokens), 'Content-Language':(tokenize, filterTokens), 'Content-Length':(last, int), 'Content-Location':(last,), # TODO: URI object? 
'Content-MD5':(last, parseContentMD5), 'Content-Range':(last, parseContentRange), 'Content-Type':(lambda str:tokenize(str, foldCase=False), parseContentType), 'Expires':(last, parseExpires), 'Last-Modified':(last, parseDateTime), } generator_entity_headers = { 'Allow':(generateList, singleHeader), 'Content-Encoding':(generateList, singleHeader), 'Content-Language':(generateList, singleHeader), 'Content-Length':(str, singleHeader), 'Content-Location':(str, singleHeader), 'Content-MD5':(base64.encodestring, lambda x: x.strip("\n"), singleHeader), 'Content-Range':(generateContentRange, singleHeader), 'Content-Type':(generateContentType, singleHeader), 'Expires':(generateDateTime, singleHeader), 'Last-Modified':(generateDateTime, singleHeader), } DefaultHTTPHandler.updateParsers(parser_general_headers) DefaultHTTPHandler.updateParsers(parser_request_headers) DefaultHTTPHandler.updateParsers(parser_response_headers) DefaultHTTPHandler.updateParsers(parser_entity_headers) DefaultHTTPHandler.updateGenerators(generator_general_headers) DefaultHTTPHandler.updateGenerators(generator_request_headers) DefaultHTTPHandler.updateGenerators(generator_response_headers) DefaultHTTPHandler.updateGenerators(generator_entity_headers) # casemappingify(DefaultHTTPParsers) # casemappingify(DefaultHTTPGenerators) # lowerify(DefaultHTTPParsers) # lowerify(DefaultHTTPGenerators) TwistedWeb2-8.1.0/twisted/web2/plugin.py0000644000175000017500000000332010340730377016531 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_plugin -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """I'm a set of utility functions and resources for using twisted.plugins to locate resources. Example Usage: root.putChild('test', resourcePlugger('TestResource')) """ from twisted.web2 import resource, http, iweb from twisted.plugin import getPlugins from twisted.python.reflect import namedClass class PluginResource(resource.Resource): def __init__(self, *args, **kwargs): """A plugin resource atleast has to accept any arguments given to it, but it doesn't have to do anything with it, this is dumb I know. """ pass class TestResource(PluginResource, resource.LeafResource): def __init__(self, foo=None, bar=None): self.foo = foo self.bar = bar def locateChild(self, req, segments): return resource.LeafResource.locateChild(self, req, segments) def render(self, req): return http.Response(200, stream="I am a very simple resource, a pluggable resource too") class NoPlugin(resource.LeafResource): def __init__(self, plugin): self.plugin = plugin def render(self, req): return http.Response(404, stream="No Such Plugin %s" % self.plugin) def resourcePlugger(name, *args, **kwargs): resrcClass = None for p in getPlugins(iweb.IResource): if p.name == name: resrcClass = namedClass(p.className) break if resrcClass is None: resrcClass = kwargs.get('defaultResource', None) if resrcClass is None: return NoPlugin(name) del kwargs['defaultResource'] return resrcClass(*args, **kwargs) TwistedWeb2-8.1.0/twisted/web2/__init__.py0000644000175000017500000000037610431431321016766 0ustar dokodoko# -*- test-case-name: twisted.web2.test -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """ Twisted Web2: a better Twisted Web Server. """ from twisted.web2._version import version __version__ = version.short() TwistedWeb2-8.1.0/twisted/web2/twscgi.py0000644000175000017500000001166710442325576016554 0ustar dokodoko"""SCGI client resource and protocols. 
""" # TODO: # * Handle scgi server death, half way through a resonse. from zope.interface import implements from twisted.internet import defer, protocol, reactor from twisted.protocols import basic from twisted.web2 import http, iweb, resource, responsecode, stream, twcgi class SCGIClientResource(resource.LeafResource): """A resource that connects to an SCGI server and relays the server's response to the browser. This resource connects to a SCGI server on a known host ('localhost', by default) and port. It has no responsibility for starting the SCGI server. If the server is not running when a client connects then a BAD_GATEWAY response will be returned immediately. """ def __init__(self, port, host='localhost'): """Initialise a SCGI client resource """ resource.LeafResource.__init__(self) self.host = host self.port = port def renderHTTP(self, request): return doSCGI(request, self.host, self.port) def doSCGI(request, host, port): if request.stream.length is None: return http.Response(responsecode.LENGTH_REQUIRED) factory = SCGIClientProtocolFactory(request) reactor.connectTCP(host, port, factory) return factory.deferred class SCGIClientProtocol(basic.LineReceiver): """Protocol for talking to a SCGI server. """ def __init__(self, request, deferred): self.request = request self.deferred = deferred self.stream = stream.ProducerStream() self.response = http.Response(stream=self.stream) def connectionMade(self): # Ooh, look someone did all the hard work for me :). env = twcgi.createCGIEnvironment(self.request) # Send the headers. The Content-Length header should always be sent # first and must be 0 if not present. # The whole lot is sent as one big netstring with each name and value # separated by a '\0'. contentLength = str(env.pop('CONTENT_LENGTH', 0)) env['SCGI'] = '1' scgiHeaders = [] scgiHeaders.append('%s\x00%s\x00'%('CONTENT_LENGTH', str(contentLength))) scgiHeaders.append('SCGI\x001\x00') for name, value in env.iteritems(): if name in ('CONTENT_LENGTH', 'SCGI'): continue scgiHeaders.append('%s\x00%s\x00'%(name,value)) scgiHeaders = ''.join(scgiHeaders) self.transport.write('%d:%s,' % (len(scgiHeaders), scgiHeaders)) stream.StreamProducer(self.request.stream).beginProducing(self.transport) def lineReceived(self, line): # Look for end of headers if line == '': # Switch into raw mode to recieve data and callback the deferred # with the response instance. The data will be streamed as it # arrives. Callback the deferred and set self.response to None, # because there are no promises that the response will not be # mutated by a resource higher in the tree, such as # log.LogWrapperResource self.setRawMode() self.deferred.callback(self.response) self.response = None return # Split the header into name and value. The 'Status' header is handled # specially; all other headers are simply passed onto the response I'm # building. name, value = line.split(':',1) value = value.strip() if name.lower() == 'status': value = value.split(None,1)[0] self.response.code = int(value) else: self.response.headers.addRawHeader(name, value) def rawDataReceived(self, data): self.stream.write(data) def connectionLost(self, reason): # The connection is closed and all data has been streamed via the # response. Tell the response stream it's over. self.stream.finish() class SCGIClientProtocolFactory(protocol.ClientFactory): """SCGI client protocol factory. I am created by a SCGIClientResource to connect to an SCGI server. When I connect I create a SCGIClientProtocol instance to do all the talking with the server. 
The ``deferred`` attribute is passed on to the protocol and is fired with the HTTP response from the server once it has been recieved. """ protocol = SCGIClientProtocol noisy = False # Make Factory shut up def __init__(self, request): self.request = request self.deferred = defer.Deferred() def buildProtocol(self, addr): return self.protocol(self.request, self.deferred) def clientConnectionFailed(self, connector, reason): self.sendFailureResponse(reason) def sendFailureResponse(self, reason): response = http.Response(code=responsecode.BAD_GATEWAY, stream=str(reason.value)) self.deferred.callback(response) __all__ = ['SCGIClientResource'] TwistedWeb2-8.1.0/twisted/web2/vhost.py0000644000175000017500000001710510456304373016405 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_vhost -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """I am a virtual hosts implementation. """ # System Imports import urlparse from zope.interface import implements import urllib import warnings from twisted.internet import address from twisted.python import log # Sibling Imports from twisted.web2 import resource from twisted.web2 import responsecode from twisted.web2 import iweb from twisted.web2 import http class NameVirtualHost(resource.Resource): """Resource in charge of dispatching requests to other resources based on the value of the HTTP 'Host' header. @param supportNested: If True domain segments will be chopped off until the TLD is reached or a matching virtual host is found. (In which case the child resource can do its own more specific virtual host lookup.) """ supportNested = True def __init__(self, default=None): """ @param default: The default resource to be served when encountering an unknown hostname. @type default: L{twisted.web2.iweb.IResource} or C{None} """ resource.Resource.__init__(self) self.hosts = {} self.default = default def addHost(self, name, resrc): """Add a host to this virtual host. - The Fun Stuff(TM) This associates a host named 'name' with a resource 'resrc':: nvh.addHost('nevow.com', nevowDirectory) nvh.addHost('divmod.org', divmodDirectory) nvh.addHost('twistedmatrix.com', twistedMatrixDirectory) I told you that was fun. @param name: The FQDN to be matched to the 'Host' header. @type name: C{str} @param resrc: The L{twisted.web2.iweb.IResource} to be served as the given hostname. @type resource: L{twisted.web2.iweb.IResource} """ self.hosts[name] = resrc def removeHost(self, name): """Remove the given host. @param name: The FQDN to remove. @type name: C{str} """ del self.hosts[name] def locateChild(self, req, segments): """It's a NameVirtualHost, do you know where your children are? This uses locateChild magic so you don't have to mutate the request. """ host = req.host.lower() if self.supportNested: while not self.hosts.has_key(host) and len(host.split('.')) > 1: host = '.'.join(host.split('.')[1:]) # Default being None is okay, it'll turn into a 404 return self.hosts.get(host, self.default), segments class AutoVHostURIRewrite(object): """ I do request mangling to insure that children know what host they are being accessed from behind apache2. Usage: - Twisted:: root = MyResource() vur = vhost.AutoVHostURIRewrite(root) - Apache2:: ProxyPass http://localhost:8538/ RequestHeader set X-App-Location /whatever/ If the trailing / is ommitted in the second argument to ProxyPass VHostURIRewrite will return a 404 response code. 
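# --- Hedged sketch (not part of the original vhost.py): dispatching two
# domains with the NameVirtualHost defined above. The hostnames are
# illustrative, and _Hello stands in for real IResource implementations.
from twisted.web2 import resource as _resource, http as _http

class _Hello(_resource.Resource):
    def render(self, req):
        return _http.Response(200, stream="hello")

nvh = NameVirtualHost(default=_Hello())
nvh.addHost('example.com', _Hello())
nvh.addHost('static.example.com', _Hello())
# Because supportNested is True, a request for 'www.example.com' is retried
# as 'example.com' after the leading domain segment is chopped off.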
If proxying HTTPS, add this to the Apache config:: RequestHeader set X-App-Scheme https """ implements(iweb.IResource) def __init__(self, resource, sendsRealHost=False): """ @param resource: The resource to serve after mutating the request. @type resource: L{twisted.web2.iweb.IResource} @param sendsRealHost: If True then the proxy will be expected to send the HTTP 'Host' header that was sent by the requesting client. @type sendsRealHost: C{bool} """ self.resource=resource self.sendsRealHost = sendsRealHost def renderHTTP(self, req): return http.Response(responsecode.NOT_FOUND) def locateChild(self, req, segments): scheme = req.headers.getRawHeaders('x-app-scheme') if self.sendsRealHost: host = req.headers.getRawHeaders('host') else: host = req.headers.getRawHeaders('x-forwarded-host') app_location = req.headers.getRawHeaders('x-app-location') remote_ip = req.headers.getRawHeaders('x-forwarded-for') if not (host and remote_ip): if not host: warnings.warn( ("No host was obtained either from Host or " "X-Forwarded-Host headers. If your proxy does not " "send either of these headers use VHostURIRewrite. " "If your proxy sends the real host as the Host header " "use " "AutoVHostURIRewrite(resrc, sendsRealHost=True)")) # some header unspecified => Error raise http.HTTPError(responsecode.BAD_REQUEST) host = host[0] remote_ip = remote_ip[0] if app_location: app_location = app_location[0] else: app_location = '/' if scheme: scheme = scheme[0] else: scheme='http' req.host, req.port = http.splitHostPort(scheme, host) req.scheme = scheme req.remoteAddr = address.IPv4Address('TCP', remote_ip, 0) req.prepath = app_location[1:].split('/')[:-1] req.path = '/'+('/'.join([urllib.quote(s, '') for s in (req.prepath + segments)])) return self.resource, segments class VHostURIRewrite(object): """ I do request mangling to insure that children know what host they are being accessed from behind mod_proxy. Usage: - Twisted:: root = MyResource() vur = vhost.VHostURIRewrite(uri='http://hostname:port/path', resource=root) server.Site(vur) - Apache:: ProxyPass /path/ http://localhost:8080/ Servername hostname If the trailing / is ommitted in the second argument to ProxyPass VHostURIRewrite will return a 404 response code. uri must be a fully specified uri complete with scheme://hostname/path/ """ implements(iweb.IResource) def __init__(self, uri, resource): """ @param uri: The URI to be used for mutating the request. This MUST include scheme://hostname/path. @type uri: C{str} @param resource: The resource to serve after mutating the request. 
@type resource: L{twisted.web2.iweb.IResource} """ self.resource = resource (self.scheme, self.host, self.path, params, querystring, fragment) = urlparse.urlparse(uri) if params or querystring or fragment: raise ValueError("Must not specify params, query args, or fragment to VHostURIRewrite") self.path = map(urllib.unquote, self.path[1:].split('/'))[:-1] self.host, self.port = http.splitHostPort(self.scheme, self.host) def renderHTTP(self, req): return http.Response(responsecode.NOT_FOUND) def locateChild(self, req, segments): req.scheme = self.scheme req.host = self.host req.port = self.port req.prepath=self.path[:] req.path = '/'+('/'.join([urllib.quote(s, '') for s in (req.prepath + segments)])) # print req.prepath, segments, req.postpath, req.path return self.resource, segments __all__ = ['VHostURIRewrite', 'AutoVHostURIRewrite', 'NameVirtualHost'] TwistedWeb2-8.1.0/twisted/web2/xmlrpc.py0000644000175000017500000001650610763340326016552 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_xmlrpc -*- # # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """ A generic resource for publishing objects via XML-RPC. Maintainer: U{Itamar Shtull-Trauring} """ # System Imports import xmlrpclib # Sibling Imports from twisted.web2 import resource, stream from twisted.web2 import responsecode, http, http_headers from twisted.internet import defer from twisted.python import log, reflect # Useful so people don't need to import xmlrpclib directly Fault = xmlrpclib.Fault Binary = xmlrpclib.Binary Boolean = xmlrpclib.Boolean DateTime = xmlrpclib.DateTime class NoSuchFunction(Fault): """There is no function by the given name.""" pass class XMLRPC(resource.Resource): """A resource that implements XML-RPC. You probably want to connect this to '/RPC2'. Methods published can return XML-RPC serializable results, Faults, Binary, Boolean, DateTime, Deferreds, or Handler instances. By default methods beginning with 'xmlrpc_' are published. Sub-handlers for prefixed methods (e.g., system.listMethods) can be added with putSubHandler. By default, prefixes are separated with a '.'. Override self.separator to change this. """ # Error codes for Twisted, if they conflict with yours then # modify them at runtime. NOT_FOUND = 8001 FAILURE = 8002 separator = '.' def __init__(self): resource.Resource.__init__(self) self.subHandlers = {} def putSubHandler(self, prefix, handler): self.subHandlers[prefix] = handler def getSubHandler(self, prefix): return self.subHandlers.get(prefix, None) def getSubHandlerPrefixes(self): return self.subHandlers.keys() def render(self, request): # For GET/HEAD: Return an error message s=("XML-RPC responder" "

<body><h1>XML-RPC responder</h1>
POST your XML-RPC here.") return http.Response(responsecode.OK, {'content-type': http_headers.MimeType('text', 'html')}, s) def http_POST(self, request): parser, unmarshaller = xmlrpclib.getparser() deferred = stream.readStream(request.stream, parser.feed) deferred.addCallback(lambda x: self._cbDispatch( request, parser, unmarshaller)) deferred.addErrback(self._ebRender) deferred.addCallback(self._cbRender, request) return deferred def _cbDispatch(self, request, parser, unmarshaller): parser.close() args, functionPath = unmarshaller.close(), unmarshaller.getmethodname() function = self.getFunction(functionPath) return defer.maybeDeferred(function, request, *args) def _cbRender(self, result, request): if not isinstance(result, Fault): result = (result,) try: s = xmlrpclib.dumps(result, methodresponse=1) except: f = Fault(self.FAILURE, "can't serialize output") s = xmlrpclib.dumps(f, methodresponse=1) return http.Response(responsecode.OK, {'content-type': http_headers.MimeType('text', 'xml')}, s) def _ebRender(self, failure): if isinstance(failure.value, Fault): return failure.value log.err(failure) return Fault(self.FAILURE, "error") def getFunction(self, functionPath): """Given a string, return a function, or raise NoSuchFunction. This returned function will be called, and should return the result of the call, a Deferred, or a Fault instance. Override in subclasses if you want your own policy. The default policy is that given functionPath 'foo', return the method at self.xmlrpc_foo, i.e. getattr(self, "xmlrpc_" + functionPath). If functionPath contains self.separator, the sub-handler for the initial prefix is used to search for the remaining path. """ if functionPath.find(self.separator) != -1: prefix, functionPath = functionPath.split(self.separator, 1) handler = self.getSubHandler(prefix) if handler is None: raise NoSuchFunction(self.NOT_FOUND, "no such subHandler %s" % prefix) return handler.getFunction(functionPath) f = getattr(self, "xmlrpc_%s" % functionPath, None) if not f: raise NoSuchFunction(self.NOT_FOUND, "function %s not found" % functionPath) elif not callable(f): raise NoSuchFunction(self.NOT_FOUND, "function %s not callable" % functionPath) else: return f def _listFunctions(self): """Return a list of the names of all xmlrpc methods.""" return reflect.prefixedMethodNames(self.__class__, 'xmlrpc_') class XMLRPCIntrospection(XMLRPC): """Implement the XML-RPC Introspection API. By default, the methodHelp method returns the 'help' method attribute, if it exists, otherwise the __doc__ method attribute, if it exists, otherwise the empty string. To enable the methodSignature method, add a 'signature' method attribute containing a list of lists. See methodSignature's documentation for the format. Note the type strings should be XML-RPC types, not Python types. """ def __init__(self, parent): """Implement Introspection support for an XMLRPC server. @param parent: the XMLRPC server to add Introspection support to. 
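# --- Hedged sketch (not part of the original xmlrpc.py): a minimal XML-RPC
# resource built on the XMLRPC class above. Methods named 'xmlrpc_*' take the
# request as their first argument and are published automatically; the Echo
# class, its methods, and the /RPC2 mount point are illustrative.
class Echo(XMLRPC):
    def xmlrpc_echo(self, request, value):
        """Return the argument unchanged."""
        return value

    def xmlrpc_add(self, request, a, b):
        return a + b

rpc = Echo()
addIntrospection(rpc)   # defined below; publishes system.listMethods etc.
# root.putChild('RPC2', rpc)  # 'root' would be the site's root resource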
""" XMLRPC.__init__(self) self._xmlrpc_parent = parent def xmlrpc_listMethods(self, request): """Return a list of the method names implemented by this server.""" functions = [] todo = [(self._xmlrpc_parent, '')] while todo: obj, prefix = todo.pop(0) functions.extend([prefix + name for name in obj._listFunctions()]) todo.extend([(obj.getSubHandler(name), prefix + name + obj.separator) for name in obj.getSubHandlerPrefixes()]) functions.sort() return functions xmlrpc_listMethods.signature = [['array']] def xmlrpc_methodHelp(self, request, method): """Return a documentation string describing the use of the given method. """ method = self._xmlrpc_parent.getFunction(method) return (getattr(method, 'help', None) or getattr(method, '__doc__', None) or '') xmlrpc_methodHelp.signature = [['string', 'string']] def xmlrpc_methodSignature(self, request, method): """Return a list of type signatures. Each type signature is a list of the form [rtype, type1, type2, ...] where rtype is the return type and typeN is the type of the Nth argument. If no signature information is available, the empty string is returned. """ method = self._xmlrpc_parent.getFunction(method) return getattr(method, 'signature', None) or '' xmlrpc_methodSignature.signature = [['array', 'string'], ['string', 'string']] def addIntrospection(xmlrpc): """Add Introspection support to an XMLRPC server. @param xmlrpc: The xmlrpc server to add Introspection support to. """ xmlrpc.putSubHandler('system', XMLRPCIntrospection(xmlrpc)) __all__ = ["XMLRPC", "NoSuchFunction", "Fault"] TwistedWeb2-8.1.0/twisted/web2/server.py0000644000175000017500000005136210713026424016545 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_server -*- # Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ This is a web-server which integrates with the twisted.internet infrastructure. """ # System Imports import cgi, time, urlparse from urllib import quote, unquote from urlparse import urlsplit import weakref from zope.interface import implements # Twisted Imports from twisted.internet import defer from twisted.python import log, failure # Sibling Imports from twisted.web2 import http, iweb, fileupload, responsecode from twisted.web2 import http_headers from twisted.web2.filter.range import rangefilter from twisted.web2 import error from twisted.web2 import version as web2_version from twisted import __version__ as twisted_version VERSION = "Twisted/%s TwistedWeb/%s" % (twisted_version, web2_version) _errorMarker = object() def defaultHeadersFilter(request, response): if not response.headers.hasHeader('server'): response.headers.setHeader('server', VERSION) if not response.headers.hasHeader('date'): response.headers.setHeader('date', time.time()) return response defaultHeadersFilter.handleErrors = True def preconditionfilter(request, response): if request.method in ("GET", "HEAD"): http.checkPreconditions(request, response) return response def doTrace(request): request = iweb.IRequest(request) txt = "%s %s HTTP/%d.%d\r\n" % (request.method, request.uri, request.clientproto[0], request.clientproto[1]) l=[] for name, valuelist in request.headers.getAllRawHeaders(): for value in valuelist: l.append("%s: %s\r\n" % (name, value)) txt += ''.join(l) return http.Response( responsecode.OK, {'content-type': http_headers.MimeType('message', 'http')}, txt) def parsePOSTData(request, maxMem=100*1024, maxFields=1024, maxSize=10*1024*1024): """ Parse data of a POST request. @param request: the request to parse. 
@type request: L{twisted.web2.http.Request}. @param maxMem: maximum memory used during the parsing of the data. @type maxMem: C{int} @param maxFields: maximum number of form fields allowed. @type maxFields: C{int} @param maxSize: maximum size of file upload allowed. @type maxSize: C{int} @return: a deferred that will fire when the parsing is done. The deferred itself doesn't hold a return value, the request is modified directly. @rtype: C{defer.Deferred} """ if request.stream.length == 0: return defer.succeed(None) parser = None ctype = request.headers.getHeader('content-type') if ctype is None: return defer.succeed(None) def updateArgs(data): args = data request.args.update(args) def updateArgsAndFiles(data): args, files = data request.args.update(args) request.files.update(files) def error(f): f.trap(fileupload.MimeFormatError) raise http.HTTPError( http.StatusResponse(responsecode.BAD_REQUEST, str(f.value))) if (ctype.mediaType == 'application' and ctype.mediaSubtype == 'x-www-form-urlencoded'): d = fileupload.parse_urlencoded(request.stream) d.addCallbacks(updateArgs, error) return d elif (ctype.mediaType == 'multipart' and ctype.mediaSubtype == 'form-data'): boundary = ctype.params.get('boundary') if boundary is None: return defer.fail(http.HTTPError( http.StatusResponse( responsecode.BAD_REQUEST, "Boundary not specified in Content-Type."))) d = fileupload.parseMultipartFormData(request.stream, boundary, maxMem, maxFields, maxSize) d.addCallbacks(updateArgsAndFiles, error) return d else: return defer.fail(http.HTTPError( http.StatusResponse( responsecode.BAD_REQUEST, "Invalid content-type: %s/%s" % ( ctype.mediaType, ctype.mediaSubtype)))) class StopTraversal(object): """ Indicates to Request._handleSegment that it should stop handling path segments. """ pass class Request(http.Request): """ vars: site remoteAddr scheme host port path params querystring args files prepath postpath @ivar path: The path only (arguments not included). @ivar args: All of the arguments, including URL and POST arguments. @type args: A mapping of strings (the argument names) to lists of values. i.e., ?foo=bar&foo=baz&quux=spam results in {'foo': ['bar', 'baz'], 'quux': ['spam']}. """ implements(iweb.IRequest) site = None _initialprepath = None responseFilters = [rangefilter, preconditionfilter, error.defaultErrorHandler, defaultHeadersFilter] def __init__(self, *args, **kw): if kw.has_key('site'): self.site = kw['site'] del kw['site'] if kw.has_key('prepathuri'): self._initialprepath = kw['prepathuri'] del kw['prepathuri'] # Copy response filters from the class self.responseFilters = self.responseFilters[:] self.files = {} self.resources = [] http.Request.__init__(self, *args, **kw) def addResponseFilter(self, f, atEnd=False): if atEnd: self.responseFilters.append(f) else: self.responseFilters.insert(0, f) def unparseURL(self, scheme=None, host=None, port=None, path=None, params=None, querystring=None, fragment=None): """Turn the request path into a url string. For any pieces of the url that are not specified, use the value from the request. 
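# --- Hedged sketch (not part of the original server.py): using the
# parsePOSTData helper defined above from a resource's POST handler. The
# EchoForm class is illustrative; http and responsecode are the modules
# already imported at the top of this file.
from twisted.web2 import resource as _resource

class EchoForm(_resource.Resource):
    def http_POST(self, request):
        d = parsePOSTData(request)
        def _done(_):
            # parsePOSTData fills in request.args (and request.files) in place
            return http.Response(responsecode.OK, {}, repr(request.args))
        return d.addCallback(_done)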
The arguments have the same meaning as the same named attributes of Request.""" if scheme is None: scheme = self.scheme if host is None: host = self.host if port is None: port = self.port if path is None: path = self.path if params is None: params = self.params if querystring is None: query = self.querystring if fragment is None: fragment = '' if port == http.defaultPortForScheme.get(scheme, 0): hostport = host else: hostport = host + ':' + str(port) return urlparse.urlunparse(( scheme, hostport, path, params, querystring, fragment)) def _parseURL(self): if self.uri[0] == '/': # Can't use urlparse for request_uri because urlparse # wants to be given an absolute or relative URI, not just # an abs_path, and thus gets '//foo' wrong. self.scheme = self.host = self.path = self.params = self.querystring = '' if '?' in self.uri: self.path, self.querystring = self.uri.split('?', 1) else: self.path = self.uri if ';' in self.path: self.path, self.params = self.path.split(';', 1) else: # It is an absolute uri, use standard urlparse (self.scheme, self.host, self.path, self.params, self.querystring, fragment) = urlparse.urlparse(self.uri) if self.querystring: self.args = cgi.parse_qs(self.querystring, True) else: self.args = {} path = map(unquote, self.path[1:].split('/')) if self._initialprepath: # We were given an initial prepath -- this is for supporting # CGI-ish applications where part of the path has already # been processed prepath = map(unquote, self._initialprepath[1:].split('/')) if path[:len(prepath)] == prepath: self.prepath = prepath self.postpath = path[len(prepath):] else: self.prepath = [] self.postpath = path else: self.prepath = [] self.postpath = path #print "_parseURL", self.uri, (self.uri, self.scheme, self.host, self.path, self.params, self.querystring) def _fixupURLParts(self): hostaddr, secure = self.chanRequest.getHostInfo() if not self.scheme: self.scheme = ('http', 'https')[secure] if self.host: self.host, self.port = http.splitHostPort(self.scheme, self.host) else: # If GET line wasn't an absolute URL host = self.headers.getHeader('host') if host: self.host, self.port = http.splitHostPort(self.scheme, host) else: # When no hostname specified anywhere, either raise an # error, or use the interface hostname, depending on # protocol version if self.clientproto >= (1,1): raise http.HTTPError(responsecode.BAD_REQUEST) self.host = hostaddr.host self.port = hostaddr.port def process(self): "Process a request." try: self.checkExpect() resp = self.preprocessRequest() if resp is not None: self._cbFinishRender(resp).addErrback(self._processingFailed) return self._parseURL() self._fixupURLParts() self.remoteAddr = self.chanRequest.getRemoteHost() except: failedDeferred = self._processingFailed(failure.Failure()) return d = defer.Deferred() d.addCallback(self._getChild, self.site.resource, self.postpath) d.addCallback(lambda res, req: res.renderHTTP(req), self) d.addCallback(self._cbFinishRender) d.addErrback(self._processingFailed) d.callback(None) def preprocessRequest(self): """Do any request processing that doesn't follow the normal resource lookup procedure. "OPTIONS *" is handled here, for example. 
This would also be the place to do any CONNECT processing.""" if self.method == "OPTIONS" and self.uri == "*": response = http.Response(responsecode.OK) response.headers.setHeader('allow', ('GET', 'HEAD', 'OPTIONS', 'TRACE')) return response # This is where CONNECT would go if we wanted it return None def _getChild(self, _, res, path, updatepaths=True): """Call res.locateChild, and pass the result on to _handleSegment.""" self.resources.append(res) if not path: return res result = res.locateChild(self, path) if isinstance(result, defer.Deferred): return result.addCallback(self._handleSegment, res, path, updatepaths) else: return self._handleSegment(result, res, path, updatepaths) def _handleSegment(self, result, res, path, updatepaths): """Handle the result of a locateChild call done in _getChild.""" newres, newpath = result # If the child resource is None then display a error page if newres is None: raise http.HTTPError(responsecode.NOT_FOUND) # If we got a deferred then we need to call back later, once the # child is actually available. if isinstance(newres, defer.Deferred): return newres.addCallback( lambda actualRes: self._handleSegment( (actualRes, newpath), res, path, updatepaths) ) if path: url = quote("/" + "/".join(path)) else: url = "/" if newpath is StopTraversal: # We need to rethink how to do this. #if newres is res: self._rememberResource(res, url) return res #else: # raise ValueError("locateChild must not return StopTraversal with a resource other than self.") newres = iweb.IResource(newres) if newres is res: assert not newpath is path, "URL traversal cycle detected when attempting to locateChild %r from resource %r." % (path, res) assert len(newpath) < len(path), "Infinite loop impending..." if updatepaths: # We found a Resource... update the request.prepath and postpath for x in xrange(len(path) - len(newpath)): self.prepath.append(self.postpath.pop(0)) child = self._getChild(None, newres, newpath, updatepaths=updatepaths) self._rememberResource(child, url) return child _urlsByResource = weakref.WeakKeyDictionary() def _rememberResource(self, resource, url): """ Remember the URL of a visited resource. """ self._urlsByResource[resource] = url return resource def urlForResource(self, resource): """ Looks up the URL of the given resource if this resource was found while processing this request. Specifically, this includes the requested resource, and resources looked up via L{locateResource}. Note that a resource may be found at multiple URIs; if the same resource is visited at more than one location while processing this request, this method will return one of those URLs, but which one is not defined, nor whether the same URL is returned in subsequent calls. @param resource: the resource to find a URI for. This resource must have been obtained from the request (ie. via its C{uri} attribute, or through its C{locateResource} or C{locateChildResource} methods). @return: a valid URL for C{resource} in this request. @raise NoURLForResourceError: if C{resource} has no URL in this request (because it was not obtained from the request). """ resource = self._urlsByResource.get(resource, None) if resource is None: raise NoURLForResourceError(resource) return resource def locateResource(self, url): """ Looks up the resource with the given URL. @param uri: The URL of the desired resource. @return: a L{Deferred} resulting in the L{IResource} at the given URL or C{None} if no such resource can be located. 
@raise HTTPError: If C{url} is not a URL on the site that this request is being applied to. The contained response will have a status code of L{responsecode.BAD_GATEWAY}. @raise HTTPError: If C{url} contains a query or fragment. The contained response will have a status code of L{responsecode.BAD_REQUEST}. """ if url is None: return None # # Parse the URL # (scheme, host, path, query, fragment) = urlsplit(url) if query or fragment: raise http.HTTPError(http.StatusResponse( responsecode.BAD_REQUEST, "URL may not contain a query or fragment: %s" % (url,) )) # The caller shouldn't be asking a request on one server to lookup a # resource on some other server. if (scheme and scheme != self.scheme) or (host and host != self.headers.getHeader("host")): raise http.HTTPError(http.StatusResponse( responsecode.BAD_GATEWAY, "URL is not on this site (%s://%s/): %s" % (scheme, self.headers.getHeader("host"), url) )) segments = path.split("/") assert segments[0] == "", "URL path didn't begin with '/': %s" % (path,) segments = map(unquote, segments[1:]) def notFound(f): f.trap(http.HTTPError) if f.value.response.code != responsecode.NOT_FOUND: return f return None d = defer.maybeDeferred(self._getChild, None, self.site.resource, segments, updatepaths=False) d.addCallback(self._rememberResource, path) d.addErrback(notFound) return d def locateChildResource(self, parent, childName): """ Looks up the child resource with the given name given the parent resource. This is similar to locateResource(), but doesn't have to start the lookup from the root resource, so it is potentially faster. @param parent: the parent of the resource being looked up. This resource must have been obtained from the request (ie. via its C{uri} attribute, or through its C{locateResource} or C{locateChildResource} methods). @param childName: the name of the child of C{parent} to looked up. to C{parent}. @return: a L{Deferred} resulting in the L{IResource} at the given URL or C{None} if no such resource can be located. @raise NoURLForResourceError: if C{resource} was not obtained from the request. """ if parent is None or childName is None: return None assert "/" not in childName, "Child name may not contain '/': %s" % (childName,) parentURL = self.urlForResource(parent) if not parentURL.endswith("/"): parentURL += "/" url = parentURL + quote(childName) segment = childName def notFound(f): f.trap(http.HTTPError) if f.value.response.code != responsecode.NOT_FOUND: return f return None d = defer.maybeDeferred(self._getChild, None, parent, [segment], updatepaths=False) d.addCallback(self._rememberResource, url) d.addErrback(notFound) return d def _processingFailed(self, reason): if reason.check(http.HTTPError) is not None: # If the exception was an HTTPError, leave it alone d = defer.succeed(reason.value.response) else: # Otherwise, it was a random exception, so give a # ICanHandleException implementer a chance to render the page. def _processingFailed_inner(reason): handler = iweb.ICanHandleException(self, self) return handler.renderHTTP_exception(self, reason) d = defer.maybeDeferred(_processingFailed_inner, reason) d.addCallback(self._cbFinishRender) d.addErrback(self._processingReallyFailed, reason) return d def _processingReallyFailed(self, reason, origReason): log.msg("Exception rendering error page:", isErr=1) log.err(reason) log.msg("Original exception:", isErr=1) log.err(origReason) body = ("Internal Server Error" "

<body><h1>Internal Server Error</h1>
An error occurred rendering the requested page. Additionally, an error occured rendering the error page.") response = http.Response( responsecode.INTERNAL_SERVER_ERROR, {'content-type': http_headers.MimeType('text','html')}, body) self.writeResponse(response) def _cbFinishRender(self, result): def filterit(response, f): if (hasattr(f, 'handleErrors') or (response.code >= 200 and response.code < 300 and response.code != 204)): return f(self, response) else: return response response = iweb.IResponse(result, None) if response: d = defer.Deferred() for f in self.responseFilters: d.addCallback(filterit, f) d.addCallback(self.writeResponse) d.callback(response) return d resource = iweb.IResource(result, None) if resource: self.resources.append(resource) d = defer.maybeDeferred(resource.renderHTTP, self) d.addCallback(self._cbFinishRender) return d raise TypeError("html is not a resource or a response") def renderHTTP_exception(self, req, reason): log.msg("Exception rendering:", isErr=1) log.err(reason) body = ("Internal Server Error" "

<body><h1>Internal Server Error</h1>
An error occurred rendering the requested page. More information is available in the server log.") return http.Response( responsecode.INTERNAL_SERVER_ERROR, {'content-type': http_headers.MimeType('text','html')}, body) class Site(object): def __init__(self, resource): """Initialize. """ self.resource = iweb.IResource(resource) def __call__(self, *args, **kwargs): return Request(site=self, *args, **kwargs) class NoURLForResourceError(RuntimeError): def __init__(self, resource): RuntimeError.__init__(self, "Resource %r has no URL in this request." % (resource,)) self.resource = resource __all__ = ['Request', 'Site', 'StopTraversal', 'VERSION', 'defaultHeadersFilter', 'doTrace', 'parsePOSTData', 'preconditionfilter', 'NoURLForResourceError'] TwistedWeb2-8.1.0/twisted/web2/wsgi.py0000644000175000017500000002561110756322353016215 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_wsgi -*- # Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ An implementation of PEP 333: Python Web Server Gateway Interface (WSGI). """ import os, threading from zope.interface import implements from twisted.internet import defer, threads from twisted.internet import reactor from twisted.python import log, failure from twisted.web2 import http from twisted.web2 import iweb from twisted.web2 import server from twisted.web2 import stream from twisted.web2.twcgi import createCGIEnvironment class AlreadyStartedResponse(Exception): pass # This isn't a subclass of resource.Resource, because it shouldn't do # any method-specific actions at all. All that stuff is totally up to # the contained wsgi application class WSGIResource(object): """ A web2 Resource which wraps the given WSGI application callable. The WSGI application will be called in a separate thread (using the reactor threadpool) whenever a request for this resource or any lower part of the url hierarchy is received. """ implements(iweb.IResource) def __init__(self, application): self.application = application def renderHTTP(self, req): # Do stuff with WSGIHandler. handler = WSGIHandler(self.application, req) # Get deferred d = handler.responseDeferred # Run it in a thread reactor.callInThread(handler.run) return d def locateChild(self, request, segments): return self, server.StopTraversal class InputStream(object): """ This class implements the 'wsgi.input' object. The methods are expected to have the same behavior as the same-named methods for python's builtin file object. """ def __init__(self, newstream): # Called in IO thread self.stream = stream.BufferedStream(newstream) def read(self, size=None): """ Read at most size bytes from the input, or less if EOF is encountered. If size is ommitted or negative, read until EOF. """ # Called in application thread if size < 0: size = None return threads.blockingCallFromThread( reactor, self.stream.readExactly, size) def readline(self, size=None): """ Read a line, delimited by a newline. If the stream reaches EOF or size bytes have been read before reaching a newline (if size is given), the partial line is returned. COMPATIBILITY NOTE: the size argument is excluded from the WSGI specification, but is provided here anyhow, because useful libraries such as python stdlib's cgi.py assume their input file-like-object supports readline with a size argument. If you use it, be aware your application may not be portable to other conformant WSGI servers. """ # Called in application thread if size < 0: # E.g. -1, which is the default readline size for *some* # other file-like-objects... 
size = None return threads.blockingCallFromThread( reactor, self.stream.readline, '\n', size=size) def readlines(self, hint=None): """ Read until EOF, collecting all lines in a list, and returns that list. The hint argument is ignored (as is allowed in the API specification) """ # Called in application thread data = self.read() lines = data.split('\n') last = lines.pop() lines = [s+'\n' for s in lines] if last != '': lines.append(last) return lines def __iter__(self): """ Returns an iterator, each iteration of which returns the result of readline(), and stops when readline() returns an empty string. """ while 1: line = self.readline() if not line: return yield line class ErrorStream(object): """ This class implements the 'wsgi.error' object. """ def flush(self): # Called in application thread return def write(self, s): # Called in application thread log.msg("WSGI app error: "+s, isError=True) def writelines(self, seq): # Called in application thread s = ''.join(seq) log.msg("WSGI app error: "+s, isError=True) class WSGIHandler(object): headersSent = False stopped = False stream = None def __init__(self, application, request): # Called in IO thread self.setupEnvironment(request) self.application = application self.request = request self.response = None self.responseDeferred = defer.Deferred() def setupEnvironment(self, request): # Called in IO thread env = createCGIEnvironment(request) env['wsgi.version'] = (1, 0) env['wsgi.url_scheme'] = env['REQUEST_SCHEME'] env['wsgi.input'] = InputStream(request.stream) env['wsgi.errors'] = ErrorStream() env['wsgi.multithread'] = True env['wsgi.multiprocess'] = False env['wsgi.run_once'] = False env['wsgi.file_wrapper'] = FileWrapper self.environment = env def startWSGIResponse(self, status, response_headers, exc_info=None): # Called in application thread if exc_info is not None: try: if self.headersSent: raise exc_info[0], exc_info[1], exc_info[2] finally: exc_info = None elif self.response is not None: raise AlreadyStartedResponse, 'startWSGIResponse(%r)' % status status = int(status.split(' ')[0]) self.response = http.Response(status) for key, value in response_headers: self.response.headers.addRawHeader(key, value) return self.write def run(self): # Called in application thread try: result = self.application(self.environment, self.startWSGIResponse) self.handleResult(result) except: if not self.headersSent: reactor.callFromThread(self.__error, failure.Failure()) else: reactor.callFromThread(self.stream.finish, failure.Failure()) def __callback(self): # Called in IO thread self.responseDeferred.callback(self.response) self.responseDeferred = None def __error(self, f): # Called in IO thread self.responseDeferred.errback(f) self.responseDeferred = None def write(self, output): # Called in application thread if self.response is None: raise RuntimeError( "Application didn't call startResponse before writing data!") if not self.headersSent: self.stream=self.response.stream=stream.ProducerStream() self.headersSent = True # threadsafe event object to communicate paused state. 
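# --- Hedged sketch (not part of the original wsgi.py): wrapping a trivial
# WSGI application in the WSGIResource defined above. The application
# function and the port are illustrative.
def hello_app(environ, start_response):
    body = "Hello from WSGI under twisted.web2\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

from twisted.web2 import server as _server, channel as _channel
reactor.listenTCP(8080, _channel.HTTPFactory(_server.Site(WSGIResource(hello_app))))
# reactor.run()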
self.unpaused = threading.Event() # After this, we cannot touch self.response from this # thread any more def _start(): # Called in IO thread self.stream.registerProducer(self, True) self.__callback() # Notify application thread to start writing self.unpaused.set() reactor.callFromThread(_start) # Wait for unpaused to be true self.unpaused.wait() reactor.callFromThread(self.stream.write, output) def writeAll(self, result): # Called in application thread if not self.headersSent: if self.response is None: raise RuntimeError( "Application didn't call startResponse before writing data!") l = 0 for item in result: l += len(item) self.response.stream=stream.ProducerStream(length=l) self.response.stream.buffer = list(result) self.response.stream.finish() reactor.callFromThread(self.__callback) else: # Has already been started, cannot replace the stream def _write(): # Called in IO thread for s in result: self.stream.write(s) self.stream.finish() reactor.callFromThread(_write) def handleResult(self, result): # Called in application thread try: if (isinstance(result, FileWrapper) and hasattr(result.filelike, 'fileno') and not self.headersSent): if self.response is None: raise RuntimeError( "Application didn't call startResponse before writing data!") self.headersSent = True # Make FileStream and output it. We make a new file # object from the fd, just in case the original one # isn't an actual file object. self.response.stream = stream.FileStream( os.fdopen(os.dup(result.filelike.fileno()))) reactor.callFromThread(self.__callback) return if type(result) in (list,tuple): # If it's a list or tuple (exactly, not subtype!), # then send the entire thing down to Twisted at once, # and free up this thread to do other work. self.writeAll(result) return # Otherwise, this thread has to keep running to provide the # data. for data in result: if self.stopped: return self.write(data) if not self.headersSent: if self.response is None: raise RuntimeError( "Application didn't call startResponse, and didn't send any data!") self.headersSent = True reactor.callFromThread(self.__callback) else: reactor.callFromThread(self.stream.finish) finally: if hasattr(result,'close'): result.close() def pauseProducing(self): # Called in IO thread self.unpaused.set() def resumeProducing(self): # Called in IO thread self.unpaused.clear() def stopProducing(self): self.stopped = True class FileWrapper(object): """ Wrapper to convert file-like objects to iterables, to implement the optional 'wsgi.file_wrapper' object. """ def __init__(self, filelike, blksize=8192): self.filelike = filelike self.blksize = blksize if hasattr(filelike,'close'): self.close = filelike.close def __iter__(self): return self def next(self): data = self.filelike.read(self.blksize) if data: return data raise StopIteration __all__ = ['WSGIResource'] TwistedWeb2-8.1.0/twisted/web2/dirlist.py0000644000175000017500000001004210402117232016670 0ustar dokodoko# Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. 
"""Directory listing.""" # system imports import os import urllib import stat import time # twisted imports from twisted.web2 import iweb, resource, http, http_headers def formatFileSize(size): if size < 1024: return '%i' % size elif size < (1024**2): return '%iK' % (size / 1024) elif size < (1024**3): return '%iM' % (size / (1024**2)) else: return '%iG' % (size / (1024**3)) class DirectoryLister(resource.Resource): def __init__(self, pathname, dirs=None, contentTypes={}, contentEncodings={}, defaultType='text/html'): self.contentTypes = contentTypes self.contentEncodings = contentEncodings self.defaultType = defaultType # dirs allows usage of the File to specify what gets listed self.dirs = dirs self.path = pathname resource.Resource.__init__(self) def data_listing(self, request, data): if self.dirs is None: directory = os.listdir(self.path) directory.sort() else: directory = self.dirs files = [] for path in directory: url = urllib.quote(path, '/') fullpath = os.path.join(self.path, path) try: st = os.stat(fullpath) except OSError: continue if stat.S_ISDIR(st.st_mode): url = url + '/' files.append({ 'link': url, 'linktext': path + "/", 'size': '', 'type': '-', 'lastmod': time.strftime("%Y-%b-%d %H:%M", time.localtime(st.st_mtime)) }) else: from twisted.web2.static import getTypeAndEncoding mimetype, encoding = getTypeAndEncoding( path, self.contentTypes, self.contentEncodings, self.defaultType) filesize = st.st_size files.append({ 'link': url, 'linktext': path, 'size': formatFileSize(filesize), 'type': mimetype, 'lastmod': time.strftime("%Y-%b-%d %H:%M", time.localtime(st.st_mtime)) }) return files def __repr__(self): return '' % self.path __str__ = __repr__ def render(self, request): title = "Directory listing for %s" % urllib.unquote(request.path) s= """%s

%s

""" % (title,title) s+="" s+="" even = False for row in self.data_listing(request, None): s+='' % (even and 'even' or 'odd',) s+='' % row even = not even s+="
FilenameSizeLast ModifiedFile Type
%(linktext)s%(size)s%(lastmod)s%(type)s
" response = http.Response(200, {}, s) response.headers.setHeader("content-type", http_headers.MimeType('text', 'html')) return response __all__ = ['DirectoryLister'] TwistedWeb2-8.1.0/twisted/web2/channel/0000755000175000017500000000000011014056216016263 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/channel/__init__.py0000644000175000017500000000067110261613314020401 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_cgi,twisted.web2.test.test_http -*- # See LICENSE for details. """ Various backend channel implementations for web2. """ from twisted.web2.channel.cgi import startCGI from twisted.web2.channel.scgi import SCGIFactory from twisted.web2.channel.http import HTTPFactory from twisted.web2.channel.fastcgi import FastCGIFactory __all__ = ['startCGI', 'SCGIFactory', 'HTTPFactory', 'FastCGIFactory'] TwistedWeb2-8.1.0/twisted/web2/channel/http.py0000644000175000017500000007765010546316055017644 0ustar dokodokoimport warnings import socket from cStringIO import StringIO from zope.interface import implements from twisted.python import log from twisted.internet import interfaces, protocol, reactor from twisted.protocols import policies, basic from twisted.web2 import responsecode from twisted.web2 import http_headers from twisted.web2 import http PERSIST_NO_PIPELINE, PERSIST_PIPELINE = (1,2) _cachedHostNames = {} def _cachedGetHostByAddr(hostaddr): hostname = _cachedHostNames.get(hostaddr) if hostname is None: try: hostname = socket.gethostbyaddr(hostaddr)[0] except socket.herror: hostname = hostaddr _cachedHostNames[hostaddr]=hostname return hostname class StringTransport(object): """ I am a StringIO wrapper that conforms for the transport API. I support the 'writeSequence' method. """ def __init__(self): self.s = StringIO() def writeSequence(self, seq): self.s.write(''.join(seq)) def __getattr__(self, attr): return getattr(self.__dict__['s'], attr) class AbortedException(Exception): pass class HTTPParser(object): """This class handles the parsing side of HTTP processing. With a suitable subclass, it can parse either the client side or the server side of the connection. """ # Class config: parseCloseAsEnd = False # Instance vars chunkedIn = False headerlen = 0 length = None inHeaders = None partialHeader = '' connHeaders = None finishedReading = False channel = None # For subclassing... # Needs attributes: # version # Needs functions: # createRequest() # processRequest() # _abortWithError() # handleContentChunk(data) # handleContentComplete() # Needs functions to exist on .channel # channel.maxHeaderLength # channel.requestReadFinished(self) # channel.setReadPersistent(self, persistent) # (from LineReceiver): # channel.setRawMode() # channel.setLineMode(extraneous) # channel.pauseProducing() # channel.resumeProducing() # channel.stopProducing() def __init__(self, channel): self.inHeaders = http_headers.Headers() self.channel = channel def lineReceived(self, line): if self.chunkedIn: # Parsing a chunked input if self.chunkedIn == 1: # First we get a line like "chunk-size [';' chunk-extension]" # (where chunk extension is just random crap as far as we're concerned) # RFC says to ignore any extensions you don't recognize -- that's all of them. chunksize = line.split(';', 1)[0] try: self.length = int(chunksize, 16) except: self._abortWithError(responsecode.BAD_REQUEST, "Invalid chunk size, not a hex number: %s!" 
% chunksize) if self.length < 0: self._abortWithError(responsecode.BAD_REQUEST, "Invalid chunk size, negative.") if self.length == 0: # We're done, parse the trailers line self.chunkedIn = 3 else: # Read self.length bytes of raw data self.channel.setRawMode() elif self.chunkedIn == 2: # After we got data bytes of the appropriate length, we end up here, # waiting for the CRLF, then go back to get the next chunk size. if line != '': self._abortWithError(responsecode.BAD_REQUEST, "Excess %d bytes sent in chunk transfer mode" % len(line)) self.chunkedIn = 1 elif self.chunkedIn == 3: # TODO: support Trailers (maybe! but maybe not!) # After getting the final "0" chunk we're here, and we *EAT MERCILESSLY* # any trailer headers sent, and wait for the blank line to terminate the # request. if line == '': self.allContentReceived() # END of chunk handling elif line == '': # Empty line => End of headers if self.partialHeader: self.headerReceived(self.partialHeader) self.partialHeader = '' self.allHeadersReceived() # can set chunkedIn self.createRequest() if self.chunkedIn: # stay in linemode waiting for chunk header pass elif self.length == 0: # no content expected self.allContentReceived() else: # await raw data as content self.channel.setRawMode() # Should I do self.pauseProducing() here? self.processRequest() else: self.headerlen += len(line) if self.headerlen > self.channel.maxHeaderLength: self._abortWithError(responsecode.BAD_REQUEST, 'Headers too long.') if line[0] in ' \t': # Append a header continuation self.partialHeader += line else: if self.partialHeader: self.headerReceived(self.partialHeader) self.partialHeader = line def rawDataReceived(self, data): """Handle incoming content.""" datalen = len(data) if datalen < self.length: self.handleContentChunk(data) self.length = self.length - datalen else: self.handleContentChunk(data[:self.length]) extraneous = data[self.length:] channel = self.channel # could go away from allContentReceived. if not self.chunkedIn: self.allContentReceived() else: # NOTE: in chunked mode, self.length is the size of the current chunk, # so we still have more to read. self.chunkedIn = 2 # Read next chunksize channel.setLineMode(extraneous) def headerReceived(self, line): """Store this header away. Check for too much header data (> channel.maxHeaderLength) and abort the connection if so. """ nameval = line.split(':', 1) if len(nameval) != 2: self._abortWithError(responsecode.BAD_REQUEST, "No ':' in header.") name, val = nameval val = val.lstrip(' \t') self.inHeaders.addRawHeader(name, val) def allHeadersReceived(self): # Split off connection-related headers connHeaders = self.splitConnectionHeaders() # Set connection parameters from headers self.setConnectionParams(connHeaders) self.connHeaders = connHeaders def allContentReceived(self): self.finishedReading = True self.channel.requestReadFinished(self) self.handleContentComplete() def splitConnectionHeaders(self): """ Split off connection control headers from normal headers. The normal headers are then passed on to user-level code, while the connection headers are stashed in .connHeaders and used for things like request/response framing. This corresponds roughly with the HTTP RFC's description of 'hop-by-hop' vs 'end-to-end' headers in RFC2616 S13.5.1, with the following exceptions: * proxy-authenticate and proxy-authorization are not treated as connection headers. 
* content-length is, as it is intimiately related with low-level HTTP parsing, and is made available to user-level code via the stream length, rather than a header value. (except for HEAD responses, in which case it is NOT used by low-level HTTP parsing, and IS kept in the normal headers. """ def move(name): h = inHeaders.getRawHeaders(name, None) if h is not None: inHeaders.removeHeader(name) connHeaders.setRawHeaders(name, h) # NOTE: According to HTTP spec, we're supposed to eat the # 'Proxy-Authenticate' and 'Proxy-Authorization' headers also, but that # doesn't sound like a good idea to me, because it makes it impossible # to have a non-authenticating transparent proxy in front of an # authenticating proxy. An authenticating proxy can eat them itself. # # 'Proxy-Connection' is an undocumented HTTP 1.0 abomination. connHeaderNames = ['content-length', 'connection', 'keep-alive', 'te', 'trailers', 'transfer-encoding', 'upgrade', 'proxy-connection'] inHeaders = self.inHeaders connHeaders = http_headers.Headers() move('connection') if self.version < (1,1): # Remove all headers mentioned in Connection, because a HTTP 1.0 # proxy might have erroneously forwarded it from a 1.1 client. for name in connHeaders.getHeader('connection', ()): if inHeaders.hasHeader(name): inHeaders.removeHeader(name) else: # Otherwise, just add the headers listed to the list of those to move connHeaderNames.extend(connHeaders.getHeader('connection', ())) # If the request was HEAD, self.length has been set to 0 by # HTTPClientRequest.submit; in this case, Content-Length should # be treated as a response header, not a connection header. # Note: this assumes the invariant that .length will always be None # coming into this function, unless this is a HEAD request. if self.length is not None: connHeaderNames.remove('content-length') for headername in connHeaderNames: move(headername) return connHeaders def setConnectionParams(self, connHeaders): # Figure out persistent connection stuff if self.version >= (1,1): if 'close' in connHeaders.getHeader('connection', ()): readPersistent = False else: readPersistent = PERSIST_PIPELINE elif 'keep-alive' in connHeaders.getHeader('connection', ()): readPersistent = PERSIST_NO_PIPELINE else: readPersistent = False # Okay, now implement section 4.4 Message Length to determine # how to find the end of the incoming HTTP message. transferEncoding = connHeaders.getHeader('transfer-encoding') if transferEncoding: if transferEncoding[-1] == 'chunked': # Chunked self.chunkedIn = 1 # Cut off the chunked encoding (cause it's special) transferEncoding = transferEncoding[:-1] elif not self.parseCloseAsEnd: # Would close on end of connection, except this can't happen for # client->server data. (Well..it could actually, since TCP has half-close # but the HTTP spec says it can't, so we'll pretend it's right.) self._abortWithError(responsecode.BAD_REQUEST, "Transfer-Encoding received without chunked in last position.") # TODO: support gzip/etc encodings. # FOR NOW: report an error if the client uses any encodings. # They shouldn't, because we didn't send a TE: header saying it's okay. if transferEncoding: self._abortWithError(responsecode.NOT_IMPLEMENTED, "Transfer-Encoding %s not supported." % transferEncoding) else: # No transfer-coding. 
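# --- Illustrative example (not part of the original channel/http.py) of the
# chunked framing handled by lineReceived/rawDataReceived above: each chunk is
# a hexadecimal size line followed by that many bytes, and a zero-size chunk
# (optionally followed by trailer headers) ends the body.
example_chunked_body = (
    "4\r\n"
    "Wiki\r\n"
    "5\r\n"
    "pedia\r\n"
    "0\r\n"
    "\r\n"
)  # the decoded entity body is "Wikipedia"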
self.chunkedIn = 0 if self.parseCloseAsEnd: # If no Content-Length, then it's indeterminate length data # (unless the responsecode was one of the special no body ones) # Also note that for HEAD requests, connHeaders won't have # content-length even if the response did. if self.code in http.NO_BODY_CODES: self.length = 0 else: self.length = connHeaders.getHeader('content-length', self.length) # If it's an indeterminate stream without transfer encoding, it must be # the last request. if self.length is None: readPersistent = False else: # If no Content-Length either, assume no content. self.length = connHeaders.getHeader('content-length', 0) # Set the calculated persistence self.channel.setReadPersistent(readPersistent) def abortParse(self): # If we're erroring out while still reading the request if not self.finishedReading: self.finishedReading = True self.channel.setReadPersistent(False) self.channel.requestReadFinished(self) # producer interface def pauseProducing(self): if not self.finishedReading: self.channel.pauseProducing() def resumeProducing(self): if not self.finishedReading: self.channel.resumeProducing() def stopProducing(self): if not self.finishedReading: self.channel.stopProducing() class HTTPChannelRequest(HTTPParser): """This class handles the state and parsing for one HTTP request. It is responsible for all the low-level connection oriented behavior. Thus, it takes care of keep-alive, de-chunking, etc., and passes the non-connection headers on to the user-level Request object.""" command = path = version = None queued = 0 request = None out_version = "HTTP/1.1" def __init__(self, channel, queued=0): HTTPParser.__init__(self, channel) self.queued=queued # Buffer writes to a string until we're first in line # to write a response if queued: self.transport = StringTransport() else: self.transport = self.channel.transport # set the version to a fallback for error generation self.version = (1,0) def gotInitialLine(self, initialLine): parts = initialLine.split() # Parse the initial request line if len(parts) != 3: if len(parts) == 1: parts.append('/') if len(parts) == 2 and parts[1][0] == '/': parts.append('HTTP/0.9') else: self._abortWithError(responsecode.BAD_REQUEST, 'Bad request line: %s' % initialLine) self.command, self.path, strversion = parts try: protovers = http.parseVersion(strversion) if protovers[0] != 'http': raise ValueError() except ValueError: self._abortWithError(responsecode.BAD_REQUEST, "Unknown protocol: %s" % strversion) self.version = protovers[1:3] # Ensure HTTP 0 or HTTP 1. if self.version[0] > 1: self._abortWithError(responsecode.HTTP_VERSION_NOT_SUPPORTED, 'Only HTTP 0.9 and HTTP 1.x are supported.') if self.version[0] == 0: # simulate end of headers, as HTTP 0 doesn't have headers. 
self.lineReceived('') def lineLengthExceeded(self, line, wasFirst=False): code = wasFirst and responsecode.REQUEST_URI_TOO_LONG or responsecode.BAD_REQUEST self._abortWithError(code, 'Header line too long.') def createRequest(self): self.request = self.channel.requestFactory(self, self.command, self.path, self.version, self.length, self.inHeaders) del self.inHeaders def processRequest(self): self.request.process() def handleContentChunk(self, data): self.request.handleContentChunk(data) def handleContentComplete(self): self.request.handleContentComplete() ############## HTTPChannelRequest *RESPONSE* methods ############# producer = None chunkedOut = False finished = False ##### Request Callbacks ##### def writeIntermediateResponse(self, code, headers=None): if self.version >= (1,1): self._writeHeaders(code, headers, False) def writeHeaders(self, code, headers): self._writeHeaders(code, headers, True) def _writeHeaders(self, code, headers, addConnectionHeaders): # HTTP 0.9 doesn't have headers. if self.version[0] == 0: return l = [] code_message = responsecode.RESPONSES.get(code, "Unknown Status") l.append('%s %s %s\r\n' % (self.out_version, code, code_message)) if headers is not None: for name, valuelist in headers.getAllRawHeaders(): for value in valuelist: l.append("%s: %s\r\n" % (name, value)) if addConnectionHeaders: # if we don't have a content length, we send data in # chunked mode, so that we can support persistent connections. if (headers.getHeader('content-length') is None and self.command != "HEAD" and code not in http.NO_BODY_CODES): if self.version >= (1,1): l.append("%s: %s\r\n" % ('Transfer-Encoding', 'chunked')) self.chunkedOut = True else: # Cannot use persistent connections if we can't do chunking self.channel.dropQueuedRequests() if self.channel.isLastRequest(self): l.append("%s: %s\r\n" % ('Connection', 'close')) elif self.version < (1,1): l.append("%s: %s\r\n" % ('Connection', 'Keep-Alive')) l.append("\r\n") self.transport.writeSequence(l) def write(self, data): if not data: return elif self.chunkedOut: self.transport.writeSequence(("%X\r\n" % len(data), data, "\r\n")) else: self.transport.write(data) def finish(self): """We are finished writing data.""" if self.finished: warnings.warn("Warning! request.finish called twice.", stacklevel=2) return if self.chunkedOut: # write last chunk and closing CRLF self.transport.write("0\r\n\r\n") self.finished = True if not self.queued: self._cleanup() def abortConnection(self, closeWrite=True): """Abort the HTTP connection because of some kind of unrecoverable error. If closeWrite=False, then only abort reading, but leave the writing side alone. This is mostly for internal use by the HTTP request parsing logic, so that it can call an error page generator. Otherwise, completely shut down the connection. 
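    For illustration, the internal error path uses this in roughly the
    following way -- a sketch of the pattern that _abortWithError (below)
    follows, not additional API:

        self.abortConnection(closeWrite=False)   # stop reading the request
        self.writeHeaders(errorcode, headers)    # but still write a response
        self.write(text)
        self.finish()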
""" self.abortParse() if closeWrite: if self.producer: self.producer.stopProducing() self.unregisterProducer() self.finished = True if self.queued: self.transport.reset() self.transport.truncate() else: self._cleanup() def getHostInfo(self): t=self.channel.transport secure = interfaces.ISSLTransport(t, None) is not None host = t.getHost() host.host = _cachedGetHostByAddr(host.host) return host, secure def getRemoteHost(self): return self.channel.transport.getPeer() ##### End Request Callbacks ##### def _abortWithError(self, errorcode, text=''): """Handle low level protocol errors.""" headers = http_headers.Headers() headers.setHeader('content-length', len(text)+1) self.abortConnection(closeWrite=False) self.writeHeaders(errorcode, headers) self.write(text) self.write("\n") self.finish() raise AbortedException def _cleanup(self): """Called when have finished responding and are no longer queued.""" if self.producer: log.err(RuntimeError("Producer was not unregistered for %s" % self)) self.unregisterProducer() self.channel.requestWriteFinished(self) del self.transport # methods for channel - end users should not use these def noLongerQueued(self): """Notify the object that it is no longer queued. We start writing whatever data we have to the transport, etc. This method is not intended for users. """ if not self.queued: raise RuntimeError, "noLongerQueued() got called unnecessarily." self.queued = 0 # set transport to real one and send any buffer data data = self.transport.getvalue() self.transport = self.channel.transport if data: self.transport.write(data) # if we have producer, register it with transport if (self.producer is not None) and not self.finished: self.transport.registerProducer(self.producer, True) # if we're finished, clean up if self.finished: self._cleanup() # consumer interface def registerProducer(self, producer, streaming): """Register a producer. """ if self.producer: raise ValueError, "registering producer %s before previous one (%s) was unregistered" % (producer, self.producer) self.producer = producer if self.queued: producer.pauseProducing() else: self.transport.registerProducer(producer, streaming) def unregisterProducer(self): """Unregister the producer.""" if not self.queued: self.transport.unregisterProducer() self.producer = None def connectionLost(self, reason): """connection was lost""" if self.queued and self.producer: self.producer.stopProducing() self.producer = None if self.request: self.request.connectionLost(reason) class HTTPChannel(basic.LineReceiver, policies.TimeoutMixin, object): """A receiver for HTTP requests. Handles splitting up the connection for the multiple HTTPChannelRequests that may be in progress on this channel. @ivar timeOut: number of seconds to wait before terminating an idle connection. @ivar maxPipeline: number of outstanding in-progress requests to allow before pausing the input. @ivar maxHeaderLength: number of bytes of header to accept from the client. """ implements(interfaces.IHalfCloseableProtocol) ## Configuration parameters. Set in instances or subclasses. # How many simultaneous requests to handle. maxPipeline = 4 # Timeout when between two requests betweenRequestsTimeOut = 15 # Timeout between lines or bytes while reading a request inputTimeOut = 60 * 4 # maximum length of headers (10KiB) maxHeaderLength = 10240 # Allow persistent connections? 
allowPersistentConnections = True # ChannelRequest chanRequestFactory = HTTPChannelRequest requestFactory = http.Request _first_line = 2 readPersistent = PERSIST_PIPELINE _readLost = False _writeLost = False _lingerTimer = None chanRequest = None def _callLater(self, secs, fun): reactor.callLater(secs, fun) def __init__(self): # the request queue self.requests = [] def connectionMade(self): self.setTimeout(self.inputTimeOut) self.factory.outstandingRequests+=1 def lineReceived(self, line): if self._first_line: self.setTimeout(self.inputTimeOut) # if this connection is not persistent, drop any data which # the client (illegally) sent after the last request. if not self.readPersistent: self.dataReceived = self.lineReceived = lambda *args: None return # IE sends an extraneous empty line (\r\n) after a POST request; # eat up such a line, but only ONCE if not line and self._first_line == 1: self._first_line = 2 return self._first_line = 0 if not self.allowPersistentConnections: # Don't allow a second request self.readPersistent = False try: self.chanRequest = self.chanRequestFactory(self, len(self.requests)) self.requests.append(self.chanRequest) self.chanRequest.gotInitialLine(line) except AbortedException: pass else: try: self.chanRequest.lineReceived(line) except AbortedException: pass def lineLengthExceeded(self, line): if self._first_line: # Fabricate a request object to respond to the line length violation. self.chanRequest = self.chanRequestFactory(self, len(self.requests)) self.requests.append(self.chanRequest) self.chanRequest.gotInitialLine("GET fake HTTP/1.0") try: self.chanRequest.lineLengthExceeded(line, self._first_line) except AbortedException: pass def rawDataReceived(self, data): self.setTimeout(self.inputTimeOut) try: self.chanRequest.rawDataReceived(data) except AbortedException: pass def requestReadFinished(self, request): if(self.readPersistent is PERSIST_NO_PIPELINE or len(self.requests) >= self.maxPipeline): self.pauseProducing() # reset state variables self._first_line = 1 self.chanRequest = None self.setLineMode() # Disable the idle timeout, in case this request takes a long # time to finish generating output. if len(self.requests) > 0: self.setTimeout(None) def _startNextRequest(self): # notify next request, if present, it can start writing del self.requests[0] if self._writeLost: self.transport.loseConnection() elif self.requests: self.requests[0].noLongerQueued() # resume reading if allowed to if(not self._readLost and self.readPersistent is not PERSIST_NO_PIPELINE and len(self.requests) < self.maxPipeline): self.resumeProducing() elif self._readLost: # No more incoming data, they already closed! self.transport.loseConnection() else: # no requests in queue, resume reading self.setTimeout(self.betweenRequestsTimeOut) self.resumeProducing() def setReadPersistent(self, persistent): if self.readPersistent: # only allow it to be set if it's not currently False self.readPersistent = persistent def dropQueuedRequests(self): """Called when a response is written that forces a connection close.""" self.readPersistent = False # Tell all requests but first to abort. 
for request in self.requests[1:]: request.connectionLost(None) del self.requests[1:] def isLastRequest(self, request): # Is this channel handling the last possible request return not self.readPersistent and self.requests[-1] == request def requestWriteFinished(self, request): """Called by first request in queue when it is done.""" if request != self.requests[0]: raise TypeError # Don't del because we haven't finished cleanup, so, # don't want queue len to be 0 yet. self.requests[0] = None if self.readPersistent or len(self.requests) > 1: # Do this in the next reactor loop so as to # not cause huge call stacks with fast # incoming requests. self._callLater(0, self._startNextRequest) else: self.lingeringClose() def timeoutConnection(self): #log.msg("Timing out client: %s" % str(self.transport.getPeer())) policies.TimeoutMixin.timeoutConnection(self) def lingeringClose(self): """ This is a bit complicated. This process is necessary to ensure proper workingness when HTTP pipelining is in use. Here is what it wants to do: 1. Finish writing any buffered data, then close our write side. While doing so, read and discard any incoming data. 2. When that happens (writeConnectionLost called), wait up to 20 seconds for the remote end to close their write side (our read side). 3. - If they do (readConnectionLost called), close the socket, and cancel the timeout. - If that doesn't happen, the timer fires, and makes the socket close anyways. """ # Close write half self.transport.loseWriteConnection() # Throw out any incoming data self.dataReceived = self.lineReceived = lambda *args: None self.transport.resumeProducing() def writeConnectionLost(self): # Okay, all data has been written # In 20 seconds, actually close the socket self._lingerTimer = reactor.callLater(20, self._lingerClose) self._writeLost = True def _lingerClose(self): self._lingerTimer = None self.transport.loseConnection() def readConnectionLost(self): """Read connection lost""" # If in the lingering-close state, lose the socket. if self._lingerTimer: self._lingerTimer.cancel() self._lingerTimer = None self.transport.loseConnection() return # If between requests, drop connection # when all current requests have written their data. self._readLost = True if not self.requests: # No requests in progress, lose now. self.transport.loseConnection() # If currently in the process of reading a request, this is # probably a client abort, so lose the connection. if self.chanRequest: self.transport.loseConnection() def connectionLost(self, reason): self.factory.outstandingRequests-=1 self._writeLost = True self.readConnectionLost() self.setTimeout(None) # Tell all requests to abort. for request in self.requests: if request is not None: request.connectionLost(reason) class OverloadedServerProtocol(protocol.Protocol): def connectionMade(self): self.transport.write("HTTP/1.0 503 Service Unavailable\r\n" "Content-Type: text/html\r\n" "Connection: close\r\n\r\n" "503 Service Unavailable" "

<body><h1>Service Unavailable</h1>
" "The server is currently overloaded, " "please try again later.") self.transport.loseConnection() class HTTPFactory(protocol.ServerFactory): """Factory for HTTP server.""" protocol = HTTPChannel protocolArgs = None outstandingRequests = 0 def __init__(self, requestFactory, maxRequests=600, **kwargs): self.maxRequests=maxRequests self.protocolArgs = kwargs self.protocolArgs['requestFactory']=requestFactory def buildProtocol(self, addr): if self.outstandingRequests >= self.maxRequests: return OverloadedServerProtocol() p = protocol.ServerFactory.buildProtocol(self, addr) for arg,value in self.protocolArgs.iteritems(): setattr(p, arg, value) return p __all__ = ['HTTPFactory', ] TwistedWeb2-8.1.0/twisted/web2/channel/fastcgi.py0000644000175000017500000002764510457627450020311 0ustar dokodoko""" Twisted.web2 FastCGI backend support. """ """ Okay, FastCGI is a pretty stupid protocol. Let me count some reasons: 1) Specifies ability to multiplex streams of data over a single socket, but has no form of flow control. This is fine for multiplexing stderr, but serving more than one request over a channel with no flow control is just *asking* for trouble. I avoid this and enforce one outstanding request per connection. This basically means the whole "requestId" field is worthless. 2) Has variable length packet padding. If you want padding, just make it always pad to 8 bytes fercrissake! 3) Why does every packet need to specify the version. How about just sending it once. 4) Name/value pair format. Come *on*. Is it *possible* to come up with a more complex format to send them with?? Even if you think you've gotten it down, you probably forgot that it's a stream, and the name/values can be split between two packets. (Yes, this means *you*. Don't even try to pretend you didn't miss this detail.) 
""" from twisted.internet import protocol from twisted.web2 import responsecode from twisted.web2.channel import cgi class FastCGIError(Exception): pass # Values for type component of FCGI_Header FCGI_BEGIN_REQUEST = 1 FCGI_ABORT_REQUEST = 2 FCGI_END_REQUEST = 3 FCGI_PARAMS = 4 FCGI_STDIN = 5 FCGI_STDOUT = 6 FCGI_STDERR = 7 FCGI_DATA = 8 FCGI_GET_VALUES = 9 FCGI_GET_VALUES_RESULT = 10 FCGI_UNKNOWN_TYPE = 11 typeNames = { FCGI_BEGIN_REQUEST : 'fcgi_begin_request', FCGI_ABORT_REQUEST : 'fcgi_abort_request', FCGI_END_REQUEST : 'fcgi_end_request', FCGI_PARAMS : 'fcgi_params', FCGI_STDIN : 'fcgi_stdin', FCGI_STDOUT : 'fcgi_stdout', FCGI_STDERR : 'fcgi_stderr', FCGI_DATA : 'fcgi_data', FCGI_GET_VALUES : 'fcgi_get_values', FCGI_GET_VALUES_RESULT: 'fcgi_get_values_result', FCGI_UNKNOWN_TYPE : 'fcgi_unknown_type'} # Mask for flags component of FCGI_BeginRequestBody FCGI_KEEP_CONN = 1 # Values for role component of FCGI_BeginRequestBody FCGI_RESPONDER = 1 FCGI_AUTHORIZER = 2 FCGI_FILTER = 3 # Values for protocolStatus component of FCGI_EndRequestBody FCGI_REQUEST_COMPLETE = 0 FCGI_CANT_MPX_CONN = 1 FCGI_OVERLOADED = 2 FCGI_UNKNOWN_ROLE = 3 FCGI_MAX_PACKET_LEN = 0xFFFF class Record(object): def __init__(self, type, reqId, content, version=1): self.version = version self.type = type self.reqId = reqId self.content = content self.length = len(content) if self.length > FCGI_MAX_PACKET_LEN: raise ValueError("Record length too long: %d > %d" % (self.length, FCGI_MAX_PACKET_LEN)) self.padding = 8 - (self.length & 7) self.reserved = 0 def fromHeaderString(clz, rec): self = object.__new__(clz) self.version = ord(rec[0]) self.type = ord(rec[1]) self.reqId = (ord(rec[2])<<8)|ord(rec[3]) self.length = (ord(rec[4])<<8)|ord(rec[5]) self.padding = ord(rec[6]) self.reserved = ord(rec[7]) self.content = None return self fromHeaderString = classmethod(fromHeaderString) def toOutputString(self): return "%c%c%c%c%c%c%c%c" % ( self.version, self.type, (self.reqId&0xFF00)>>8, self.reqId&0xFF, (self.length&0xFF00)>>8, self.length & 0xFF, self.padding, self.reserved) + self.content + '\0'*self.padding def totalLength(self): return 8 + self.length + self.padding def __repr__(self): return "" % ( self.version, self.type, typeNames.get(self.type), self.reqId, self.content) def parseNameValues(s): ''' @param s: String containing valid name/value data, of the form: 'namelength + valuelength + name + value' repeated 0 or more times. See C{fastcgi.writeNameValue} for how to create this string. @return: Generator of tuples of the form (name, value) ''' off = 0 while off < len(s): nameLen = ord(s[off]) off += 1 if nameLen&0x80: nameLen=(nameLen&0x7F)<<24 | ord(s[off])<<16 | ord(s[off+1])<<8 | ord(s[off+2]) off += 3 valueLen=ord(s[off]) off += 1 if valueLen&0x80: valueLen=(valueLen&0x7F)<<24 | ord(s[off])<<16 | ord(s[off+1])<<8 | ord(s[off+2]) off += 3 yield (s[off:off+nameLen], s[off+nameLen:off+nameLen+valueLen]) off += nameLen + valueLen def getLenBytes(length): if length<0x80: return chr(length) elif 0 < length <= 0x7FFFFFFF: return (chr(0x80|(length>>24)&0x7F) + chr((length>>16)&0xFF) + chr((length>>8)&0xFF) + chr(length&0xFF)) else: raise ValueError("Name length too long.") def writeNameValue(name, value): return getLenBytes(len(name)) + getLenBytes(len(value)) + name + value class FastCGIChannelRequest(cgi.BaseCGIChannelRequest): maxConnections = 100 reqId = 0 request = None ## High level protocol def packetReceived(self, packet): ''' @param packet: instance of C{fastcgi.Record}. 
@raise: FastCGIError on invalid version or where the type does not exist in funName ''' if packet.version != 1: raise FastCGIError("FastCGI packet received with version != 1") funName = typeNames.get(packet.type) if funName is None: raise FastCGIError("Unknown FastCGI packet type: %d" % packet.type) getattr(self, funName)(packet) def fcgi_get_values(self, packet): if packet.reqId != 0: raise ValueError("Should be 0!") content = "" for name,value in parseNameValues(packet.content): outval = None if name == "FCGI_MAX_CONNS": outval = str(self.maxConnections) elif name == "FCGI_MAX_REQS": outval = str(self.maxConnections) elif name == "FCGI_MPXS_CONNS": outval = "0" if outval: content += writeNameValue(name, outval) self.writePacket(Record(FCGI_GET_VALUES_RESULT, 0, content)) def fcgi_unknown_type(self, packet): # Unused, reserved for future expansion pass def fcgi_begin_request(self, packet): role = ord(packet.content[0])<<8 | ord(packet.content[1]) flags = ord(packet.content[2]) if packet.reqId == 0: raise ValueError("ReqId shouldn't be 0!") if self.reqId != 0: self.writePacket(Record(FCGI_END_REQUEST, self.reqId, "\0\0\0\0"+chr(FCGI_CANT_MPX_CONN)+"\0\0\0")) if role != FCGI_RESPONDER: self.writePacket(Record(FCGI_END_REQUEST, self.reqId, "\0\0\0\0"+chr(FCGI_UNKNOWN_ROLE)+"\0\0\0")) self.reqId = packet.reqId self.keepalive = flags & FCGI_KEEP_CONN self.params = "" def fcgi_abort_request(self, packet): if packet.reqId != self.reqId: return self.request.connectionLost() def fcgi_params(self, packet): if packet.reqId != self.reqId: return # I don't feel like doing the work to incrementally parse this stupid # protocol, so we'll just buffer all the params data before parsing. if not packet.content: self.makeRequest(dict(parseNameValues(self.params))) self.request.process() self.params += packet.content def fcgi_stdin(self, packet): if packet.reqId != self.reqId: return if not packet.content: self.request.handleContentComplete() else: self.request.handleContentChunk(packet.content) def fcgi_data(self, packet): # For filter roles only, which is currently unsupported. 
pass def write(self, data): if len(data) > FCGI_MAX_PACKET_LEN: n = 0 while 1: d = data[n*FCGI_MAX_PACKET_LEN:(n+1)*FCGI_MAX_PACKET_LEN] if not d: break self.write(d) return self.writePacket(Record(FCGI_STDOUT, self.reqId, data)) def writeHeaders(self, code, headers): l = [] code_message = responsecode.RESPONSES.get(code, "Unknown Status") l.append("Status: %s %s\n" % (code, code_message)) if headers is not None: for name, valuelist in headers.getAllRawHeaders(): for value in valuelist: l.append("%s: %s\n" % (name, value)) l.append('\n') self.write(''.join(l)) def finish(self): if self.request is None: raise RuntimeError("Request.finish called when no request was outstanding.") self.writePacket(Record(FCGI_END_REQUEST, self.reqId, "\0\0\0\0"+chr(FCGI_REQUEST_COMPLETE)+"\0\0\0")) del self.reqId, self.request if not self.keepalive: self.transport.loseConnection() ## Low level protocol paused = False _lastRecord = None recvd = "" def writePacket(self, packet): #print "Writing record", packet self.transport.write(packet.toOutputString()) def dataReceived(self, recd): self.recvd = self.recvd + recd record = self._lastRecord self._lastRecord = None while len(self.recvd) >= 8 and not self.paused: if record is None: record = Record.fromHeaderString(self.recvd[:8]) if len(self.recvd) < record.totalLength(): self._lastRecord = record break record.content = self.recvd[8:record.length+8] self.recvd = self.recvd[record.totalLength():] self.packetReceived(record) record = None def pauseProducing(self): self.paused = True self.transport.pauseProducing() def resumeProducing(self): self.paused = False self.transport.resumeProducing() self.dataReceived('') def stopProducing(self): self.paused = True self.transport.stopProducing() class FastCGIFactory(protocol.ServerFactory): protocol = FastCGIChannelRequest def __init__(self, requestFactory): self.requestFactory=requestFactory def buildProtocol(self, addr): p = protocol.ServerFactory.buildProtocol(self, addr) p.requestFactory=self.requestFactory return p # import socket # import fcntl # from twisted.web2 import tcp # class ExistingFDTCPPort(tcp.Port): # def __init__(self, socknum, factory): # tcp.Port.__init__(self, 0, factory) # # Part of base.createInternetSocket # skt = socket.fromfd(socknum, self.addressFamily, self.socketType) # skt.setblocking(0) # if fcntl and hasattr(fcntl, 'FD_CLOEXEC'): # old = fcntl.fcntl(skt.fileno(), fcntl.F_GETFD) # fcntl.fcntl(skt.fileno(), fcntl.F_SETFD, old | fcntl.FD_CLOEXEC) # # Part of tcp.startListening # self._realPortNumber = skt.getsockname()[1] # log.msg("%s starting on %s" % (self.factory.__class__, self._realPortNumber)) # # The order of the next 6 lines is kind of bizarre. If no one # # can explain it, perhaps we should re-arrange them. 
# self.factory.doStart() # skt.listen(self.backlog) # self.connected = 1 # self.socket = skt # self.fileno = self.socket.fileno # self.numberAccepts = 100 # self.startReading() # def startListening(self): # raise NotImplementedError("Cannot startListening on an ExistingFDTCPPort") TwistedWeb2-8.1.0/twisted/web2/channel/cgi.py0000644000175000017500000001175610365066615017425 0ustar dokodokoimport warnings import os import urllib from zope.interface import implements from twisted.internet import protocol, address from twisted.internet import reactor, interfaces from twisted.web2 import http, http_headers, server, responsecode class BaseCGIChannelRequest(protocol.Protocol): implements(interfaces.IHalfCloseableProtocol) finished = False requestFactory = http.Request request = None def makeRequest(self, vars): headers = http_headers.Headers() http_vers = http.parseVersion(vars['SERVER_PROTOCOL']) if http_vers[0] != 'http' or http_vers[1] > 1: _abortWithError(responsecode.INTERNAL_SERVER_ERROR, "Twisted.web CGITransport: Unknown HTTP version: " % vars['SERVER_PROTOCOL']) secure = vars.get("HTTPS") in ("1", "on") # apache extension? port = vars.get('SERVER_PORT') or 80 server_host = vars.get('SERVER_NAME') or vars.get('SERVER_ADDR') or 'localhost' self.hostinfo = address.IPv4Address('TCP', server_host, port), bool(secure) self.remoteinfo = address.IPv4Address( 'TCP', vars.get('REMOTE_ADDR', ''), vars.get('REMOTE_PORT', 0)) uri = vars.get('REQUEST_URI') # apache extension? if not uri: qstr = vars.get('QUERY_STRING', '') if qstr: qstr = "?"+urllib.quote(qstr, safe="") uri = urllib.quote(vars['SCRIPT_NAME'])+urllib.quote(vars.get('PATH_INFO', ''))+qstr for name,val in vars.iteritems(): if name.startswith('HTTP_'): name = name[5:].replace('_', '-') elif name == 'CONTENT_TYPE': name = 'content-type' else: continue headers.setRawHeaders(name, (val,)) self._dataRemaining = int(vars.get('CONTENT_LENGTH', '0')) self.request = self.requestFactory(self, vars['REQUEST_METHOD'], uri, http_vers[1:3], self._dataRemaining, headers, prepathuri=vars['SCRIPT_NAME']) def writeIntermediateResponse(self, code, headers=None): """Ignore, CGI doesn't support.""" pass def write(self, data): self.transport.write(data) def finish(self): if self.finished: warnings.warn("Warning! 
request.finish called twice.", stacklevel=2) return self.finished = True self.transport.loseConnection() def getHostInfo(self): return self.hostinfo def getRemoteHost(self): return self.remoteinfo def abortConnection(self, closeWrite=True): self.transport.loseConnection() def registerProducer(self, producer, streaming): self.transport.registerProducer(producer, streaming) def unregisterProducer(self): self.transport.unregisterProducer() def writeConnectionLost(self): self.loseConnection() def readConnectionLost(self): if self._dataRemaining > 0: # content-length was wrong, abort self.loseConnection() class CGIChannelRequest(BaseCGIChannelRequest): cgi_vers = (1, 0) def __init__(self, requestFactory, vars): self.requestFactory=requestFactory cgi_vers = http.parseVersion(vars['GATEWAY_INTERFACE']) if cgi_vers[0] != 'cgi' or cgi_vers[1] != 1: _abortWithError(responsecode.INTERNAL_SERVER_ERROR, "Twisted.web CGITransport: Unknown CGI version %s" % vars['GATEWAY_INTERFACE']) self.makeRequest(vars) def writeHeaders(self, code, headers): l = [] code_message = responsecode.RESPONSES.get(code, "Unknown Status") l.append("Status: %s %s\n" % (code, code_message)) if headers is not None: for name, valuelist in headers.getAllRawHeaders(): for value in valuelist: l.append("%s: %s\n" % (name, value)) l.append('\n') self.transport.writeSequence(l) def dataReceived(self, data): if self._dataRemaining <= 0: return if self._dataRemaining < len(data): data = data[:self._dataRemaining] self._dataRemaining -= len(data) self.request.handleContentChunk(data) if self._dataRemaining == 0: self.request.handleContentComplete() def connectionMade(self): self.request.process() if self._dataRemaining == 0: self.request.handleContentComplete() def connectionLost(self, reason): if reactor.running: reactor.stop() def startCGI(site): """Call this as the last thing in your CGI python script in order to hook up your site object with the incoming request. E.g.: >>> from twisted.web2 import channel, server >>> if __name__ == '__main__': ... 
channel.startCGI(server.Site(myToplevelResource)) """ from twisted.internet.stdio import StandardIO StandardIO(CGIChannelRequest(site, os.environ)) reactor.run() __all__ = ['startCGI'] TwistedWeb2-8.1.0/twisted/web2/channel/scgi.py0000644000175000017500000000623210261415176017574 0ustar dokodokofrom twisted.internet import protocol from twisted.web2 import responsecode from twisted.web2.channel import cgi as cgichannel class SCGIChannelRequest(cgichannel.BaseCGIChannelRequest): scgi_vers = "1" _data = "" headerLen = None def __init__(self): pass def writeHeaders(self, code, headers): l = [] code_message = responsecode.RESPONSES.get(code, "Unknown Status") l.append("Status: %s %s\n" % (code, code_message)) if headers is not None: for name, valuelist in headers.getAllRawHeaders(): for value in valuelist: l.append("%s: %s\r\n" % (name, value)) l.append('\r\n') self.transport.writeSequence(l) def makeRequest(self, vars): scgi_vers = vars['SCGI'] if scgi_vers != self.scgi_vers: _abortWithError(responsecode.INTERNAL_SERVER_ERROR, "Twisted.web SCGITransport: Unknown SCGI version %s" % vars['SCGI']) cgichannel.BaseCGIChannelRequest.makeRequest(self, vars) def dataReceived(self, data): if self.request is None: # Reading headers self._data += data if self.headerLen is None: # Haven't gotten a length prefix yet datas = data.split(':', 1) if len(datas) == 1: return self.headerLen = int(datas[0]) + 1 # +1 for the "," at the end self._data = datas[1] if len(self._data) >= self.headerLen: # Got all headers headerdata=self._data[:self.headerLen] data=self._data[self.headerLen:] items = headerdata.split('\0') assert (len(items) % 2) == 1, "malformed headers" assert items[-1]==',' env = {} for i in range(0, len(items) - 1, 2): env[items[i]] = items[i+1] self.makeRequest(env) self.request.process() if self._dataRemaining == 0: self.request.handleContentComplete() return if not data: return # no extra data in this packet # Fall through, self.request is now set, handle data else: return if self._dataRemaining <= 0: return if self._dataRemaining < len(data): data = data[:self._dataRemaining] self._dataRemaining -= len(data) self.request.handleContentChunk(data) if self._dataRemaining == 0: self.request.handleContentComplete() def connectionLost(self, reason): if self.request is not None: self.request.connectionLost(reason) class SCGIFactory(protocol.ServerFactory): protocol = SCGIChannelRequest def __init__(self, requestFactory): self.requestFactory=requestFactory def buildProtocol(self, addr): p = protocol.ServerFactory.buildProtocol(self, addr) p.requestFactory=self.requestFactory return p __all__ = ['SCGIFactory'] TwistedWeb2-8.1.0/twisted/web2/twcgi.py0000644000175000017500000002773010634575711016370 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_cgi -*- # Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ I hold resource classes and helper classes that deal with CGI scripts. 
Things which are still not working properly: - CGIScript.render doesn't set REMOTE_ADDR or REMOTE_HOST in the environment """ # System Imports import os import sys import urllib # Twisted Imports from twisted.internet import defer, protocol, reactor from twisted.python import log, filepath # Sibling Imports from twisted.web2 import http from twisted.web2 import resource from twisted.web2 import responsecode from twisted.web2 import server from twisted.web2 import stream headerNameTranslation = ''.join([c.isalnum() and c.upper() or '_' for c in map(chr, range(256))]) def createCGIEnvironment(request): # See http://hoohoo.ncsa.uiuc.edu/cgi/env.html for CGI interface spec # http://cgi-spec.golux.com/draft-coar-cgi-v11-03-clean.html for a better one remotehost = request.remoteAddr python_path = os.pathsep.join(sys.path) env = dict(os.environ) # MUST provide: if request.stream.length: env["CONTENT_LENGTH"] = str(request.stream.length) ctype = request.headers.getRawHeaders('content-type') if ctype: env["CONTENT_TYPE"] = ctype[0] env["GATEWAY_INTERFACE"] = "CGI/1.1" if request.postpath: # Should we raise an exception if this contains "/" chars? env["PATH_INFO"] = '/' + '/'.join(request.postpath) # MUST always be present, even if no query env["QUERY_STRING"] = request.querystring env["REMOTE_ADDR"] = remotehost.host env["REQUEST_METHOD"] = request.method # Should we raise an exception if this contains "/" chars? if request.prepath: env["SCRIPT_NAME"] = '/' + '/'.join(request.prepath) else: env["SCRIPT_NAME"] = '' env["SERVER_NAME"] = request.host env["SERVER_PORT"] = str(request.port) env["SERVER_PROTOCOL"] = "HTTP/%i.%i" % request.clientproto env["SERVER_SOFTWARE"] = server.VERSION # SHOULD provide # env["AUTH_TYPE"] # FIXME: add this # env["REMOTE_HOST"] # possibly dns resolve? # MAY provide # env["PATH_TRANSLATED"] # Completely worthless # env["REMOTE_IDENT"] # Completely worthless # env["REMOTE_USER"] # FIXME: add this # Unofficial, but useful and expected by applications nonetheless env["REMOTE_PORT"] = str(remotehost.port) env["REQUEST_SCHEME"] = request.scheme env["REQUEST_URI"] = request.uri env["HTTPS"] = ("off", "on")[request.scheme=="https"] env["SERVER_PORT_SECURE"] = ("0", "1")[request.scheme=="https"] # Propagate HTTP headers for title, header in request.headers.getAllRawHeaders(): envname = title.translate(headerNameTranslation) # Don't send headers we already sent otherwise, and don't # send authorization headers, because that's a security # issue. if title not in ('content-type', 'content-length', 'authorization', 'proxy-authorization'): envname = "HTTP_" + envname env[envname] = ','.join(header) for k,v in env.items(): if type(k) is not str: print "is not string:",k if type(v) is not str: print k, "is not string:",v return env def runCGI(request, filename, filterscript=None): # Make sure that we don't have an unknown content-length if request.stream.length is None: return http.Response(responsecode.LENGTH_REQUIRED) env = createCGIEnvironment(request) env['SCRIPT_FILENAME'] = filename if '=' in request.querystring: qargs = [] else: qargs = [urllib.unquote(x) for x in request.querystring.split('+')] if filterscript is None: filterscript = filename qargs = [filename] + qargs else: qargs = [filterscript, filename] + qargs d = defer.Deferred() proc = CGIProcessProtocol(request, d) reactor.spawnProcess(proc, filterscript, qargs, env, os.path.dirname(filename)) return d class CGIScript(resource.LeafResource): """I represent a CGI script. 
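    Typical usage is simply to drop an instance of me into a resource tree;
    a minimal sketch (the script path here is hypothetical):

        from twisted.web2 import server
        from twisted.web2.twcgi import CGIScript

        root = CGIScript('/var/www/cgi-bin/hello.cgi')
        site = server.Site(root)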
My implementation is complex due to the fact that it requires asynchronous IPC with an external process with an unpleasant protocol. """ def __init__(self, filename): """Initialize, with the name of a CGI script file. """ self.filename = filename resource.LeafResource.__init__(self) def render(self, request): """Do various things to conform to the CGI specification. I will set up the usual slew of environment variables, then spin off a process. """ return runCGI(request, self.filename) def http_POST(self, request): return self.render(request) class FilteredScript(CGIScript): """ I am a special version of a CGI script, that uses a specific executable (or, the first existing executable in a list of executables). This is useful for interfacing with other scripting languages that adhere to the CGI standard (cf. PHPScript). My 'filters' attribute specifies what executables to try to run, and my 'filename' init parameter describes which script to pass to the first argument of that script. """ filters = '/usr/bin/cat', def __init__(self, filename, filters=None): if filters is not None: self.filters = filters CGIScript.__init__(self, filename) def render(self, request): for filterscript in self.filters: if os.path.exists(filterscript): return runCGI(request, self.filename, filterscript) else: log.err(self.__class__.__name__ + ' could not find any of: ' + ', '.join(self.filters)) return http.Response(responsecode.INTERNAL_SERVER_ERROR) class PHP3Script(FilteredScript): """I am a FilteredScript that uses the default PHP3 command on most systems. """ filters = '/usr/bin/php3', class PHPScript(FilteredScript): """I am a FilteredScript that uses the PHP command on most systems. Sometimes, php wants the path to itself as argv[0]. This is that time. """ filters = '/usr/bin/php4-cgi', '/usr/bin/php4' class CGIProcessProtocol(protocol.ProcessProtocol): handling_headers = 1 headers_written = 0 headertext = '' errortext = '' def resumeProducing(self): self.transport.resumeProducing() def pauseProducing(self): self.transport.pauseProducing() def stopProducing(self): self.transport.loseConnection() def __init__(self, request, deferred): self.request = request self.deferred = deferred self.stream = stream.ProducerStream() self.response = http.Response(stream=self.stream) def connectionMade(self): # Send input data over to the CGI script. def _failedProducing(reason): # If you really care. #log.err(reason) pass def _finishedProducing(result): self.transport.closeChildFD(0) s = stream.StreamProducer(self.request.stream) producingDeferred = s.beginProducing(self.transport) producingDeferred.addCallback(_finishedProducing) producingDeferred.addErrback(_failedProducing) def errReceived(self, error): self.errortext = self.errortext + error def outReceived(self, output): """ Handle a chunk of input """ # First, make sure that the headers from the script are sorted # out (we'll want to do some parsing on these later.) if self.handling_headers: fullText = self.headertext + output header_endings = [] for delimiter in '\n\n','\r\n\r\n','\r\r', '\n\r\n': headerend = fullText.find(delimiter) if headerend != -1: header_endings.append((headerend, delimiter)) # Have we noticed the end of our headers in this chunk? if header_endings: header_endings.sort() headerend, delimiter = header_endings[0] # This is a final version of the header text. 
self.headertext = fullText[:headerend] linebreak = delimiter[:len(delimiter)/2] # Write all our headers to self.response for header in self.headertext.split(linebreak): self._addResponseHeader(header) output = fullText[headerend+len(delimiter):] self.handling_headers = 0 # Trigger our callback with a response self._sendResponse() # If we haven't hit the end of our headers yet, then # everything we've seen so far is _still_ headers if self.handling_headers: self.headertext = fullText # If we've stopped handling headers at this point, write # whatever output we've got. if not self.handling_headers: self.stream.write(output) def _addResponseHeader(self, header): """ Save a header until we're ready to write our Response. """ breakpoint = header.find(': ') if breakpoint == -1: log.msg('ignoring malformed CGI header: %s' % header) else: name = header.lower()[:breakpoint] text = header[breakpoint+2:] if name == 'status': try: # "123 " sometimes happens. self.response.code = int(text.split(' ', 1)[0]) except: log.msg("malformed status header: %s" % header) else: self.response.headers.addRawHeader(name, text) def processEnded(self, reason): if reason.value.exitCode != 0: log.msg("CGI %s exited with exit code %s" % (self.request.uri, reason.value.exitCode)) if self.errortext: log.msg("Errors from CGI %s: %s" % (self.request.uri, self.errortext)) if self.handling_headers: log.msg("Premature end of headers in %s: %s" % (self.request.uri, self.headertext)) self.response = http.Response(responsecode.INTERNAL_SERVER_ERROR) self._sendResponse() self.stream.finish() def _sendResponse(self): """ Call our deferred (from CGIScript.render) with a response. """ # Fix up location stuff loc = self.response.headers.getHeader('location') if loc and self.response.code == responsecode.OK: if loc[0] == '/': # FIXME: Do internal redirect raise RuntimeError("Sorry, internal redirects not implemented yet.") else: # NOTE: if a script wants to output its own redirect body, # it must specify Status: 302 itself. self.response.code = 302 self.response.stream = None self.deferred.callback(self.response) class CGIDirectory(resource.Resource, filepath.FilePath): """A directory that serves only CGI scripts (to infinite depth) and does not support directory listings. @param pathname: A path to the directory that you wish to serve CGI scripts from, for example /var/www/cgi-bin/ @type pathname: str """ addSlash = True def __init__(self, pathname): resource.Resource.__init__(self) filepath.FilePath.__init__(self, pathname) def locateChild(self, request, segments): fnp = self.child(segments[0]) if not fnp.exists(): raise http.HTTPError(responsecode.NOT_FOUND) elif fnp.isdir(): return CGIDirectory(fnp.path), segments[1:] else: return CGIScript(fnp.path), segments[1:] return None, () def render(self, request): errormsg = 'CGI directories do not support directory listing' return http.Response(responsecode.FORBIDDEN) __all__ = ['createCGIEnvironment', 'CGIDirectory', 'CGIScript', 'FilteredScript', 'PHP3Script', 'PHPScript'] TwistedWeb2-8.1.0/twisted/web2/TODO0000644000175000017500000000124110256455671015360 0ustar dokodoko- More Tests! 
- Check if closing input stream does right thing (should close connection after response is written) - Low level protocol: - Authentication (basic and digest) - Close connection if resource doesn't read entire input stream - sendfile (initial sketch in stream.py, but needs work in twisted, too) - Add parsers/generators for the rest of the headers: - Cache-Control, Pragma, Trailer, Upgrade, Via, Warning - Authorization, WWW-Authenticate - Cookies - Application server: - Other useful filters: content-encoding, caching, others? - Client library *LATER* - Convert proxy, distrib - Internal request submission API - FastCGI client/server TwistedWeb2-8.1.0/twisted/web2/http.py0000644000175000017500000004227310456304373016225 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_http -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """HyperText Transfer Protocol implementation. The second coming. Maintainer: U{James Y Knight } """ # import traceback; log.msg(''.join(traceback.format_stack())) # system imports import socket import time import cgi # twisted imports from twisted.internet import interfaces, error from twisted.python import log, components from zope.interface import implements # sibling imports from twisted.web2 import responsecode from twisted.web2 import http_headers from twisted.web2 import iweb from twisted.web2 import stream from twisted.web2.stream import IByteStream defaultPortForScheme = {'http': 80, 'https':443, 'ftp':21} def splitHostPort(scheme, hostport): """Split the host in "host:port" format into host and port fields. If port was not specified, use the default for the given scheme, if known. Returns a tuple of (hostname, portnumber).""" # Split hostport into host and port hostport = hostport.split(':', 1) try: if len(hostport) == 2: return hostport[0], int(hostport[1]) except ValueError: pass return hostport[0], defaultPortForScheme.get(scheme, 0) def parseVersion(strversion): """Parse version strings of the form Protocol '/' Major '.' Minor. E.g. 'HTTP/1.1'. Returns (protocol, major, minor). Will raise ValueError on bad syntax.""" proto, strversion = strversion.split('/') major, minor = strversion.split('.') major, minor = int(major), int(minor) if major < 0 or minor < 0: raise ValueError("negative number") return (proto.lower(), major, minor) class HTTPError(Exception): def __init__(self, codeOrResponse): """An Exception for propagating HTTP Error Responses. @param codeOrResponse: The numeric HTTP code or a complete http.Response object. @type codeOrResponse: C{int} or L{http.Response} """ Exception.__init__(self) self.response = iweb.IResponse(codeOrResponse) def __repr__(self): return "<%s %s>" % (self.__class__.__name__, self.response) class Response(object): """An object representing an HTTP Response to be sent to the client. """ implements(iweb.IResponse) code = responsecode.OK headers = None stream = None def __init__(self, code=None, headers=None, stream=None): """ @param code: The HTTP status code for this Response @type code: C{int} @param headers: Headers to be sent to the client. 
@type headers: C{dict}, L{twisted.web2.http_headers.Headers}, or C{None} @param stream: Content body to send to the HTTP client @type stream: L{twisted.web2.stream.IByteStream} """ if code is not None: self.code = int(code) if headers is not None: if isinstance(headers, dict): headers = http_headers.Headers(headers) self.headers=headers else: self.headers = http_headers.Headers() if stream is not None: self.stream = IByteStream(stream) def __repr__(self): if self.stream is None: streamlen = None else: streamlen = self.stream.length return "<%s.%s code=%d, streamlen=%s>" % (self.__module__, self.__class__.__name__, self.code, streamlen) class StatusResponse (Response): """ A L{Response} object which simply contains a status code and a description of what happened. """ def __init__(self, code, description, title=None): """ @param code: a response code in L{responsecode.RESPONSES}. @param description: a string description. @param title: the message title. If not specified or C{None}, defaults to C{responsecode.RESPONSES[code]}. """ if title is None: title = cgi.escape(responsecode.RESPONSES[code]) output = "".join(( "", "", "%s" % (title,), "", "", "

<h1>%s</h1>" % (title,), "<p>%s</p>

" % (cgi.escape(description),), "", "", )) if type(output) == unicode: output = output.encode("utf-8") mime_params = {"charset": "utf-8"} else: mime_params = {} super(StatusResponse, self).__init__(code=code, stream=output) self.headers.setHeader("content-type", http_headers.MimeType("text", "html", mime_params)) self.description = description def __repr__(self): return "<%s %s %s>" % (self.__class__.__name__, self.code, self.description) class RedirectResponse (StatusResponse): """ A L{Response} object that contains a redirect to another network location. """ def __init__(self, location): """ @param location: the URI to redirect to. """ super(RedirectResponse, self).__init__( responsecode.MOVED_PERMANENTLY, "Document moved to %s." % (location,) ) self.headers.setHeader("location", location) def NotModifiedResponse(oldResponse=None): if oldResponse is not None: headers=http_headers.Headers() for header in ( # Required from sec 10.3.5: 'date', 'etag', 'content-location', 'expires', 'cache-control', 'vary', # Others: 'server', 'proxy-authenticate', 'www-authenticate', 'warning'): value = oldResponse.headers.getRawHeaders(header) if value is not None: headers.setRawHeaders(header, value) else: headers = None return Response(code=responsecode.NOT_MODIFIED, headers=headers) def checkPreconditions(request, response=None, entityExists=True, etag=None, lastModified=None): """Check to see if this request passes the conditional checks specified by the client. May raise an HTTPError with result codes L{NOT_MODIFIED} or L{PRECONDITION_FAILED}, as appropriate. This function is called automatically as an output filter for GET and HEAD requests. With GET/HEAD, it is not important for the precondition check to occur before doing the action, as the method is non-destructive. However, if you are implementing other request methods, like PUT for your resource, you will need to call this after determining the etag and last-modified time of the existing resource but before actually doing the requested action. In that case, This examines the appropriate request headers for conditionals, (If-Modified-Since, If-Unmodified-Since, If-Match, If-None-Match, or If-Range), compares with the etag and last and and then sets the response code as necessary. @param response: This should be provided for GET/HEAD methods. If it is specified, the etag and lastModified arguments will be retrieved automatically from the response headers and shouldn't be separately specified. Not providing the response with a GET request may cause the emitted "Not Modified" responses to be non-conformant. @param entityExists: Set to False if the entity in question doesn't yet exist. Necessary for PUT support with 'If-None-Match: *'. @param etag: The etag of the resource to check against, or None. @param lastModified: The last modified date of the resource to check against, or None. @raise: HTTPError: Raised when the preconditions fail, in order to abort processing and emit an error page. 
""" if response: assert etag is None and lastModified is None # if the code is some sort of error code, don't do anything if not ((response.code >= 200 and response.code <= 299) or response.code == responsecode.PRECONDITION_FAILED): return False etag = response.headers.getHeader("etag") lastModified = response.headers.getHeader("last-modified") def matchETag(tags, allowWeak): if entityExists and '*' in tags: return True if etag is None: return False return ((allowWeak or not etag.weak) and ([etagmatch for etagmatch in tags if etag.match(etagmatch, strongCompare=not allowWeak)])) # First check if-match/if-unmodified-since # If either one fails, we return PRECONDITION_FAILED match = request.headers.getHeader("if-match") if match: if not matchETag(match, False): raise HTTPError(StatusResponse(responsecode.PRECONDITION_FAILED, "Requested resource does not have a matching ETag.")) unmod_since = request.headers.getHeader("if-unmodified-since") if unmod_since: if not lastModified or lastModified > unmod_since: raise HTTPError(StatusResponse(responsecode.PRECONDITION_FAILED, "Requested resource has changed.")) # Now check if-none-match/if-modified-since. # This bit is tricky, because of the requirements when both IMS and INM # are present. In that case, you can't return a failure code # unless *both* checks think it failed. # Also, if the INM check succeeds, ignore IMS, because INM is treated # as more reliable. # I hope I got the logic right here...the RFC is quite poorly written # in this area. Someone might want to verify the testcase against # RFC wording. # If IMS header is later than current time, ignore it. notModified = None ims = request.headers.getHeader('if-modified-since') if ims: notModified = (ims < time.time() and lastModified and lastModified <= ims) inm = request.headers.getHeader("if-none-match") if inm: if request.method in ("HEAD", "GET"): # If it's a range request, don't allow a weak ETag, as that # would break. canBeWeak = not request.headers.hasHeader('Range') if notModified != False and matchETag(inm, canBeWeak): raise HTTPError(NotModifiedResponse(response)) else: if notModified != False and matchETag(inm, False): raise HTTPError(StatusResponse(responsecode.PRECONDITION_FAILED, "Requested resource has a matching ETag.")) else: if notModified == True: if request.method in ("HEAD", "GET"): raise HTTPError(NotModifiedResponse(response)) else: # S14.25 doesn't actually say what to do for a failing IMS on # non-GET methods. But Precondition Failed makes sense to me. raise HTTPError(StatusResponse(responsecode.PRECONDITION_FAILED, "Requested resource has not changed.")) def checkIfRange(request, response): """Checks for the If-Range header, and if it exists, checks if the test passes. 
Returns true if the server should return partial data.""" ifrange = request.headers.getHeader("if-range") if ifrange is None: return True if isinstance(ifrange, http_headers.ETag): return ifrange.match(response.headers.getHeader("etag"), strongCompare=True) else: return ifrange == response.headers.getHeader("last-modified") class _NotifyingProducerStream(stream.ProducerStream): doStartReading = None def __init__(self, length=None, doStartReading=None): stream.ProducerStream.__init__(self, length=length) self.doStartReading = doStartReading def read(self): if self.doStartReading is not None: doStartReading = self.doStartReading self.doStartReading = None doStartReading() return stream.ProducerStream.read(self) def write(self, data): self.doStartReading = None stream.ProducerStream.write(self, data) def finish(self): self.doStartReading = None stream.ProducerStream.finish(self) # response codes that must have empty bodies NO_BODY_CODES = (responsecode.NO_CONTENT, responsecode.NOT_MODIFIED) class Request(object): """A HTTP request. Subclasses should override the process() method to determine how the request will be processed. @ivar method: The HTTP method that was used. @ivar uri: The full URI that was requested (includes arguments). @ivar headers: All received headers @ivar clientproto: client HTTP version @ivar stream: incoming data stream. """ implements(iweb.IRequest, interfaces.IConsumer) known_expects = ('100-continue',) def __init__(self, chanRequest, command, path, version, contentLength, headers): """ @param chanRequest: the channel request we're associated with. """ self.chanRequest = chanRequest self.method = command self.uri = path self.clientproto = version self.headers = headers if '100-continue' in self.headers.getHeader('expect', ()): doStartReading = self._sendContinue else: doStartReading = None self.stream = _NotifyingProducerStream(contentLength, doStartReading) self.stream.registerProducer(self.chanRequest, True) def checkExpect(self): """Ensure there are no expectations that cannot be met. Checks Expect header against self.known_expects.""" expects = self.headers.getHeader('expect', ()) for expect in expects: if expect not in self.known_expects: raise HTTPError(responsecode.EXPECTATION_FAILED) def process(self): """Called by channel to let you process the request. Can be overridden by a subclass to do something useful.""" pass def handleContentChunk(self, data): """Callback from channel when a piece of data has been received. Puts the data in .stream""" self.stream.write(data) def handleContentComplete(self): """Callback from channel when all data has been received. """ self.stream.unregisterProducer() self.stream.finish() def connectionLost(self, reason): """connection was lost""" pass def __repr__(self): return '<%s %s %s>'% (self.method, self.uri, self.clientproto) def _sendContinue(self): self.chanRequest.writeIntermediateResponse(responsecode.CONTINUE) def _finished(self, x): """We are finished writing data.""" self.chanRequest.finish() def _error(self, reason): if reason.check(error.ConnectionLost): log.msg("Request error: " + reason.getErrorMessage()) else: log.err(reason) # Only bother with cleanup on errors other than lost connection. self.chanRequest.abortConnection() def writeResponse(self, response): """ Write a response. """ if self.stream.doStartReading is not None: # Expect: 100-continue was requested, but 100 response has not been # sent, and there's a possibility that data is still waiting to be # sent. 
# # Ideally this means the remote side will not send any data. # However, because of compatibility requirements, it might timeout, # and decide to do so anyways at the same time we're sending back # this response. Thus, the read state is unknown after this. # We must close the connection. self.chanRequest.channel.setReadPersistent(False) # Nothing more will be read self.chanRequest.allContentReceived() if response.code != responsecode.NOT_MODIFIED: # Not modified response is *special* and doesn't get a content-length. if response.stream is None: response.headers.setHeader('content-length', 0) elif response.stream.length is not None: response.headers.setHeader('content-length', response.stream.length) self.chanRequest.writeHeaders(response.code, response.headers) # if this is a "HEAD" request, or a special response code, # don't return any data. if self.method == "HEAD" or response.code in NO_BODY_CODES: if response.stream is not None: response.stream.close() self._finished(None) return d = stream.StreamProducer(response.stream).beginProducing(self.chanRequest) d.addCallback(self._finished).addErrback(self._error) from twisted.web2 import compat components.registerAdapter(compat.makeOldRequestAdapter, iweb.IRequest, iweb.IOldRequest) components.registerAdapter(compat.OldNevowResourceAdapter, iweb.IOldNevowResource, iweb.IResource) components.registerAdapter(Response, int, iweb.IResponse) try: # If twisted.web is installed, add an adapter for it from twisted.web import resource except: pass else: components.registerAdapter(compat.OldResourceAdapter, resource.IResource, iweb.IOldNevowResource) __all__ = ['HTTPError', 'NotModifiedResponse', 'Request', 'Response', 'checkIfRange', 'checkPreconditions', 'defaultPortForScheme', 'parseVersion', 'splitHostPort'] TwistedWeb2-8.1.0/twisted/web2/proxy.py0000644000175000017500000001540010451641474016420 0ustar dokodokoraise ImportError("FIXME: this file probably doesn't work.") # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """Simplistic HTTP proxy support. This comes in two main variants - the Proxy and the ReverseProxy. When a Proxy is in use, a browser trying to connect to a server (say, www.yahoo.com) will be intercepted by the Proxy, and the proxy will covertly connect to the server, and return the result. When a ReverseProxy is in use, the client connects directly to the ReverseProxy (say, www.yahoo.com) which farms off the request to one of a pool of servers, and returns the result. Normally, a Proxy is used on the client end of an Internet connection, while a ReverseProxy is used on the server end. 
""" # twisted imports from twisted.web2 import http from twisted.internet import reactor, protocol from twisted.web2 import resource, server from zope.interface import implements, Interface # system imports import urlparse class ProxyClient(http.HTTPClient): """Used by ProxyClientFactory to implement a simple web proxy.""" def __init__(self, command, rest, version, headers, data, father): self.father = father self.command = command self.rest = rest if headers.has_key("proxy-connection"): del headers["proxy-connection"] headers["connection"] = "close" self.headers = headers self.data = data def connectionMade(self): self.sendCommand(self.command, self.rest) for header, value in self.headers.items(): self.sendHeader(header, value) self.endHeaders() self.transport.write(self.data) def handleStatus(self, version, code, message): self.father.transport.write("%s %s %s\r\n" % (version, code, message)) def handleHeader(self, key, value): self.father.transport.write("%s: %s\r\n" % (key, value)) def handleEndHeaders(self): self.father.transport.write("\r\n") def handleResponsePart(self, buffer): self.father.transport.write(buffer) def handleResponseEnd(self): self.transport.loseConnection() self.father.channel.transport.loseConnection() class ProxyClientFactory(protocol.ClientFactory): """Used by ProxyRequest to implement a simple web proxy.""" def __init__(self, command, rest, version, headers, data, father): self.father = father self.command = command self.rest = rest self.headers = headers self.data = data self.version = version def buildProtocol(self, addr): return ProxyClient(self.command, self.rest, self.version, self.headers, self.data, self.father) def clientConnectionFailed(self, connector, reason): self.father.transport.write("HTTP/1.0 501 Gateway error\r\n") self.father.transport.write("Content-Type: text/html\r\n") self.father.transport.write("\r\n") self.father.transport.write('''

<H1>Could not connect</H1>

''') class ProxyRequest(http.Request): """Used by Proxy to implement a simple web proxy.""" protocols = {'http': ProxyClientFactory} ports = {'http': 80} def process(self): parsed = urlparse.urlparse(self.uri) protocol = parsed[0] host = parsed[1] port = self.ports[protocol] if ':' in host: host, port = host.split(':') port = int(port) rest = urlparse.urlunparse(('','')+parsed[2:]) if not rest: rest = rest+'/' class_ = self.protocols[protocol] headers = self.getAllHeaders().copy() if not headers.has_key('host'): headers['host'] = host self.content.seek(0, 0) s = self.content.read() clientFactory = class_(self.method, rest, self.clientproto, headers, s, self) reactor.connectTCP(host, port, clientFactory) class Proxy(http.HTTPChannel): """This class implements a simple web proxy. Since it inherits from twisted.protocols.http.HTTPChannel, to use it you should do something like this:: from twisted.web2 import http f = http.HTTPFactory() f.protocol = Proxy Make the HTTPFactory a listener on a port as per usual, and you have a fully-functioning web proxy! """ requestFactory = ProxyRequest class ReverseProxyRequest(http.Request): """Used by ReverseProxy to implement a simple reverse proxy.""" def process(self): self.received_headers['host'] = self.factory.host clientFactory = ProxyClientFactory(self.method, self.uri, self.clientproto, self.getAllHeaders(), self.content.read(), self) reactor.connectTCP(self.factory.host, self.factory.port, clientFactory) class ReverseProxy(http.HTTPChannel): """Implements a simple reverse proxy. For details of usage, see the file examples/proxy.py""" requestFactory = ReverseProxyRequest class IConnector(Interface): """attribute name""" def connect(factory): """connect ClientFactory""" class TCPConnector: implements(IConnector) def __init__(self, host, port): self.host = host self.name = host self.port = port def connect(self, factory): reactor.connectTCP(self.host, self.port, factory) class UNIXConnector: implements(IConnector) name = 'n/a' def __init__(self, socket): self.socket = socket def connect(self, factory): reactor.connectUNIX(self.socket, factory) def ReverseProxyResource(host, port, path): return ReverseProxyResourceConnector(TCPConnector(host, port), path) class ReverseProxyResourceConnector: """Resource that renders the results gotten from another server Put this resource in the tree to cause everything below it to be relayed to a different server. """ isLeaf = True implements(resource.IResource) def __init__(self, connector, path): self.connector = connector self.path = path def render(self, request): request.received_headers['host'] = self.connector.name request.content.seek(0, 0) qs = urlparse.urlparse(request.uri)[4] path = self.path+'/'.join(request.postpath) if qs: rest = path + '?' 
+ qs else: rest = path clientFactory = ProxyClientFactory(request.method, rest, request.clientproto, request.getAllHeaders(), request.content.read(), request) self.connector.connect(clientFactory) return server.NOT_DONE_YET TwistedWeb2-8.1.0/twisted/web2/compat.py0000644000175000017500000003437710374011301016520 0ustar dokodokofrom __future__ import generators from urllib import quote, string import UserDict, math, time from cStringIO import StringIO from twisted.web2 import http_headers, iweb, stream, responsecode from twisted.internet import defer, address from twisted.python import components from twisted.spread import pb from zope.interface import implements class HeaderAdapter(UserDict.DictMixin): def __init__(self, headers): self._headers = headers def __getitem__(self, name): raw = self._headers.getRawHeaders(name) if raw is None: raise KeyError(name) return ', '.join(raw) def __setitem__(self, name, value): self._headers.setRawHeaders([value]) def __delitem__(self, name): if not self._headers.hasHeader(name): raise KeyError(name) self._headers.removeHeader(name) def iteritems(self): for k,v in self._headers.getAllRawHeaders(): yield k, ', '.join(v) def keys(self): return [k for k, _ in self.iteritems()] def __iter__(self): for k, _ in self.iteritems(): yield k def has_key(self, name): return self._headers.hasHeader(name) def makeOldRequestAdapter(original): # Cache the adapter. Replace this with a more better generalized # mechanism when one becomes available. if not hasattr(original, '_oldRequest'): original._oldRequest = OldRequestAdapter(original) return original._oldRequest def _addressToTuple(addr): if isinstance(addr, address.IPv4Address): return ('INET', addr.host, addr.port) elif isinstance(addr, address.UNIXAddress): return ('UNIX', addr.name) else: return tuple(addr) class OldRequestAdapter(pb.Copyable, components.Componentized, object): """Adapt old requests to new request """ implements(iweb.IOldRequest) def _getFrom(where, name): def _get(self): return getattr(getattr(self, where), name) return property(_get) def _getsetFrom(where, name): def _get(self): return getattr(getattr(self, where), name) def _set(self, new): setattr(getattr(self, where), name, new) def _del(self): delattr(getattr(self, where), name) return property(_get, _set, _del) def _getsetHeaders(where): def _get(self): headers = getattr(self, where).headers return HeaderAdapter(headers) def _set(self, newheaders): headers = http_headers.Headers() for n,v in newheaders.items(): headers.setRawHeaders(n, (v,)) newheaders = headers getattr(self, where).headers = newheaders return property(_get, _set) code = _getsetFrom('response', 'code') code_message = "" method = _getsetFrom('request', 'method') uri = _getsetFrom('request', 'uri') def _getClientproto(self): return "HTTP/%d.%d" % self.request.clientproto clientproto = property(_getClientproto) received_headers = _getsetHeaders('request') headers = _getsetHeaders('response') path = _getsetFrom('request', 'path') # cookies = # Do I need this? # received_cookies = # Do I need this? content = StringIO() #### FIXME args = _getsetFrom('request', 'args') # stack = # WTF is stack? 
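The _getFrom/_getsetFrom helpers above are property factories: each call manufactures a property that forwards attribute access to one of the two wrapped objects (the new-style request or the response). A standalone sketch of the same idiom, with hypothetical names:

def _forward(where, name):
    # build a property that proxies `name` on the sub-object named `where`
    def _get(self):
        return getattr(getattr(self, where), name)
    def _set(self, value):
        setattr(getattr(self, where), name, value)
    return property(_get, _set)

class _Envelope(object):
    code = _forward('response', 'code')
    method = _forward('request', 'method')
    def __init__(self, request, response):
        self.request = request
        self.response = response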
prepath = _getsetFrom('request', 'prepath') postpath = _getsetFrom('request', 'postpath') def _getClient(self): return "WTF" client = property(_getClient) def _getHost(self): return address.IPv4Address("TCP", self.request.host, self.request.port) host = property(_getHost) def __init__(self, request): from twisted.web2 import http components.Componentized.__init__(self) self.request = request self.response = http.Response(stream=stream.ProducerStream()) # This deferred will be fired by the first call to write on OldRequestAdapter # and will cause the headers to be output. self.deferredResponse = defer.Deferred() def getStateToCopyFor(self, issuer): # This is for distrib compatibility x = {} x['prepath'] = self.prepath x['postpath'] = self.postpath x['method'] = self.method x['uri'] = self.uri x['clientproto'] = self.clientproto self.content.seek(0, 0) x['content_data'] = self.content.read() x['remote'] = pb.ViewPoint(issuer, self) x['host'] = _addressToTuple(self.request.chanRequest.channel.transport.getHost()) x['client'] = _addressToTuple(self.request.chanRequest.channel.transport.getPeer()) return x def getTypeToCopy(self): # lie to PB so the ResourcePublisher doesn't have to know web2 exists # which is good because web2 doesn't exist. return 'twisted.web.server.Request' def registerProducer(self, producer, streaming): self.response.stream.registerProducer(producer, streaming) def unregisterProducer(self): self.response.stream.unregisterProducer() def finish(self): if self.deferredResponse is not None: d = self.deferredResponse self.deferredResponse = None d.callback(self.response) self.response.stream.finish() def write(self, data): if self.deferredResponse is not None: d = self.deferredResponse self.deferredResponse = None d.callback(self.response) self.response.stream.write(data) def getHeader(self, name): raw = self.request.headers.getRawHeaders(name) if raw is None: return None return ', '.join(raw) def setHeader(self, name, value): """Set an outgoing HTTP header. """ self.response.headers.setRawHeaders(name, [value]) def setResponseCode(self, code, message=None): # message ignored self.response.code = code def setLastModified(self, when): # Never returns CACHED -- can it and still be compliant? 
when = long(math.ceil(when)) self.response.headers.setHeader('last-modified', when) return None def setETag(self, etag): self.response.headers.setRawHeaders('etag', [etag]) return None def getAllHeaders(self): return dict(self.headers.iteritems()) def getRequestHostname(self): return self.request.host def getCookie(self, key): for cookie in self.request.headers.getHeader('cookie', ()): if cookie.name == key: return cookie.value return None def addCookie(self, k, v, expires=None, domain=None, path=None, max_age=None, comment=None, secure=None): if expires is None and max_age is not None: expires=max_age-time.time() cookie = http_headers.Cookie(k,v, expires=expires, domain=domain, path=path, comment=comment, secure=secure) self.response.headers.setHeader('set-cookie', self.request.headers.getHeader('set-cookie', ())+(cookie,)) def notifyFinish(self): ### FIXME return None # return self.request.notifyFinish() def getHost(self): return self.host def setHost(self, host, port, ssl=0): self.request.host = host self.request.port = port self.request.scheme = ssl and 'https' or 'http' def isSecure(self): return self.request.scheme == 'https' def getClientIP(self): if isinstance(self.request.chanRequest.getRemoteHost(), address.IPv4Address): return self.client.host else: return None return self.request.chanRequest.getRemoteHost() return "127.0.0.1" def getClient(self): return "127.0.0.1" ### FIXME: def getUser(self): return "" def getPassword(self): return "" # Identical to original methods -- hopefully these don't have to change def sibLink(self, name): "Return the text that links to a sibling of the requested resource." if self.postpath: return (len(self.postpath)*"../") + name else: return name def childLink(self, name): "Return the text that links to a child of the requested resource." lpp = len(self.postpath) if lpp > 1: return ((lpp-1)*"../") + name elif lpp == 1: return name else: # lpp == 0 if len(self.prepath) and self.prepath[-1]: return self.prepath[-1] + '/' + name else: return name def redirect(self, url): """Utility function that does a redirect. The request should have finish() called after this. """ self.setResponseCode(responsecode.FOUND) self.setHeader("location", url) def prePathURL(self): port = self.getHost().port if self.isSecure(): default = 443 else: default = 80 if port == default: hostport = '' else: hostport = ':%d' % port return quote('http%s://%s%s/%s' % ( self.isSecure() and 's' or '', self.getRequestHostname(), hostport, string.join(self.prepath, '/')), "/:") # def URLPath(self): # from twisted.python import urlpath # return urlpath.URLPath.fromRequest(self) # But nevow wants it to look like this... :( def URLPath(self): from nevow import url return url.URL.fromContext(self) def rememberRootURL(self, url=None): """ Remember the currently-processed part of the URL for later recalling. """ if url is None: url = self.prePathURL() # remove one segment self.appRootURL = url[:url.rindex("/")] else: self.appRootURL = url def getRootURL(self): """ Get a previously-remembered URL. """ return self.appRootURL session = None def getSession(self, sessionInterface = None): # Session management if not self.session: # FIXME: make sitepath be something cookiename = string.join(['TWISTED_SESSION'] + self.sitepath, "_") sessionCookie = self.getCookie(cookiename) if sessionCookie: try: self.session = self.site.getSession(sessionCookie) except KeyError: pass # if it still hasn't been set, fix it up. 
if not self.session: self.session = self.site.makeSession() self.addCookie(cookiename, self.session.uid, path='/') self.session.touch() if sessionInterface: return self.session.getComponent(sessionInterface) return self.session class OldNevowResourceAdapter(object): implements(iweb.IResource) def __init__(self, original): # Can't use self.__original= because of __setattr__. self.__dict__['_OldNevowResourceAdapter__original']=original def __getattr__(self, name): return getattr(self.__original, name) def __setattr__(self, name, value): setattr(self.__original, name, value) def __delattr__(self, name): delattr(self.__original, name) def locateChild(self, ctx, segments): from twisted.web2.server import parsePOSTData request = iweb.IRequest(ctx) if request.method == "POST": return parsePOSTData(request).addCallback( lambda x: self.__original.locateChild(ctx, segments)) return self.__original.locateChild(ctx, segments) def renderHTTP(self, ctx): from twisted.web2.server import parsePOSTData request = iweb.IRequest(ctx) if request.method == "POST": return parsePOSTData(request).addCallback(self.__reallyRender, ctx) return self.__reallyRender(None, ctx) def __reallyRender(self, ignored, ctx): # This deferred will be called when our resource is _finished_ # writing, and will make sure we write the rest of our data # and finish the connection. defer.maybeDeferred(self.__original.renderHTTP, ctx).addCallback(self.__finish, ctx) # Sometimes the __original.renderHTTP will write() before we # even get this far, and we don't want to return # oldRequest.deferred if it's already been set to None. oldRequest = iweb.IOldRequest(ctx) if oldRequest.deferredResponse is None: return oldRequest.response return oldRequest.deferredResponse def __finish(self, data, ctx): oldRequest = iweb.IOldRequest(ctx) oldRequest.write(data) oldRequest.finish() class OldResourceAdapter(object): implements(iweb.IOldNevowResource) def __init__(self, original): self.original = original def __repr__(self): return "<%s @ 0x%x adapting %r>" % (self.__class__.__name__, id(self), self.original) def locateChild(self, req, segments): import server request = iweb.IOldRequest(req) if self.original.isLeaf: return self, server.StopTraversal name = segments[0] if name == '': res = self else: request.prepath.append(request.postpath.pop(0)) res = self.original.getChildWithDefault(name, request) request.postpath.insert(0, request.prepath.pop()) if isinstance(res, defer.Deferred): return res.addCallback(lambda res: (res, segments[1:])) return res, segments[1:] def _handle_NOT_DONE_YET(self, data, request): from twisted.web.server import NOT_DONE_YET if data == NOT_DONE_YET: # Return a deferred that will never fire, so the finish # callback doesn't happen. This is because, when returning # NOT_DONE_YET, the page is responsible for calling finish. return defer.Deferred() else: return data def renderHTTP(self, req): request = iweb.IOldRequest(req) result = defer.maybeDeferred(self.original.render, request).addCallback( self._handle_NOT_DONE_YET, request) return result __all__ = [] TwistedWeb2-8.1.0/twisted/web2/log.py0000644000175000017500000001341210456304373016020 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_log -*- # Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """Logging tools. 
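The two adapter classes above, together with the registerAdapter calls in http.py, are what let an existing twisted.web or Nevow resource be served by web2: asking for iweb.IResource on the old object goes through IOldNevowResource and comes back wrapped. A hedged sketch, assuming twisted.web is installed and that the path is a placeholder:

from twisted.web2 import http, iweb          # importing http registers the compat adapters
from twisted.web import static as oldstatic

oldRoot = oldstatic.File('/var/www')          # an old-style twisted.web resource (assumed path)
web2Root = iweb.IResource(oldRoot)            # OldResourceAdapter + OldNevowResourceAdapter do the work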
This is still in flux (even moreso than the rest of web2).""" import time from twisted.python import log as tlog from twisted.internet import defer from twisted.web2 import iweb, stream, resource from zope.interface import implements, Attribute, Interface class _LogByteCounter(object): implements(stream.IByteStream) def __init__(self, stream, done): self.stream=stream self.done=done self.len=0 length=property(lambda self: self.stream.length) def _callback(self, data): if data is None: if self.done: done=self.done; self.done=None done(True, self.len) else: self.len += len(data) return data def read(self): data = self.stream.read() if isinstance(data, defer.Deferred): return data.addCallback(self._callback) return self._callback(data) def close(self): if self.done: done=self.done; self.done=None done(False, self.len) self.stream.close() class ILogInfo(Interface): """Auxilliary information about the response useful for logging.""" bytesSent=Attribute("Number of bytes sent.") responseCompleted=Attribute("Whether or not the response was completed.") secondsTaken=Attribute("Number of seconds taken to serve the request.") startTime=Attribute("Time at which the request started") class LogInfo(object): implements(ILogInfo) responseCompleted=None secondsTaken=None bytesSent=None startTime=None def logFilter(request, response, startTime=None): if startTime is None: startTime = time.time() def _log(success, length): loginfo=LogInfo() loginfo.bytesSent=length loginfo.responseCompleted=success loginfo.secondsTaken=time.time()-startTime tlog.msg(interface=iweb.IRequest, request=request, response=response, loginfo=loginfo) # Or just... # ILogger(ctx).log(...) ? if response.stream: response.stream=_LogByteCounter(response.stream, _log) else: _log(True, 0) return response logFilter.handleErrors = True class LogWrapperResource(resource.WrapperResource): def hook(self, request): # Insert logger request.addResponseFilter(logFilter, atEnd=True) monthname = [None, 'Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'] class BaseCommonAccessLoggingObserver(object): """An abstract Twisted-based logger for creating access logs. Derived implementations of this class *must* implement the ``logMessage(message)`` method, which will send the message to an actual log/file or stream. """ logFormat = '%s - %s [%s] "%s" %s %d "%s" "%s"' def logMessage(self, message): raise NotImplemented, 'You must provide an implementation.' def computeTimezoneForLog(self, tz): if tz > 0: neg = 1 else: neg = 0 tz = -tz h, rem = divmod(tz, 3600) m, rem = divmod(rem, 60) if neg: return '-%02d%02d' % (h, m) else: return '+%02d%02d' % (h, m) tzForLog = None tzForLogAlt = None def logDateString(self, when): logtime = time.localtime(when) Y, M, D, h, m, s = logtime[:6] if not time.daylight: tz = self.tzForLog if tz is None: tz = self.computeTimezoneForLog(time.timezone) self.tzForLog = tz else: tz = self.tzForLogAlt if tz is None: tz = self.computeTimezoneForLog(time.altzone) self.tzForLogAlt = tz return '%02d/%s/%02d:%02d:%02d:%02d %s' % ( D, monthname[M], Y, h, m, s, tz) def emit(self, eventDict): if eventDict.get('interface') is not iweb.IRequest: return request = eventDict['request'] response = eventDict['response'] loginfo = eventDict['loginfo'] firstLine = '%s %s HTTP/%s' %( request.method, request.uri, '.'.join([str(x) for x in request.clientproto])) self.logMessage( '%s - %s [%s] "%s" %s %d "%s" "%s"' %( request.remoteAddr.host, # XXX: Where to get user from? 
"-", self.logDateString( response.headers.getHeader('date', 0)), firstLine, response.code, loginfo.bytesSent, request.headers.getHeader('referer', '-'), request.headers.getHeader('user-agent', '-') ) ) def start(self): """Start observing log events.""" tlog.addObserver(self.emit) def stop(self): """Stop observing log events.""" tlog.removeObserver(self.emit) class FileAccessLoggingObserver(BaseCommonAccessLoggingObserver): """I log requests to a single logfile """ def __init__(self, logpath): self.logpath = logpath def logMessage(self, message): self.f.write(message + '\n') def start(self): super(FileAccessLoggingObserver, self).start() self.f = open(self.logpath, 'a', 1) def stop(self): super(FileAccessLoggingObserver, self).stop() self.f.close() class DefaultCommonAccessLoggingObserver(BaseCommonAccessLoggingObserver): """Log requests to default twisted logfile.""" def logMessage(self, message): tlog.msg(message) TwistedWeb2-8.1.0/twisted/web2/fileupload.py0000644000175000017500000002762710514325306017372 0ustar dokodokofrom __future__ import generators import re from zope.interface import implements import urllib import tempfile from twisted.internet import defer from twisted.web2.stream import IStream, FileStream, BufferedStream, readStream from twisted.web2.stream import generatorToStream, readAndDiscard from twisted.web2 import http_headers from cStringIO import StringIO ################################### ##### Multipart MIME Reader ##### ################################### class MimeFormatError(Exception): pass # parseContentDispositionFormData is absolutely horrible, but as # browsers don't seem to believe in sensible quoting rules, it's # really the only way to handle the header. (Quotes can be in the # filename, unescaped) cd_regexp = re.compile( ' *form-data; *name="([^"]*)"(?:; *filename="(.*)")?$', re.IGNORECASE) def parseContentDispositionFormData(value): match = cd_regexp.match(value) if not match: # Error parsing. raise ValueError("Unknown content-disposition format.") name=match.group(1) filename=match.group(2) return name, filename #@defer.deferredGenerator def _readHeaders(stream): """Read the MIME headers. 
Assumes we've just finished reading in the boundary string.""" ctype = fieldname = filename = None headers = [] # Now read headers while 1: line = stream.readline(size=1024) if isinstance(line, defer.Deferred): line = defer.waitForDeferred(line) yield line line = line.getResult() #print "GOT", line if not line.endswith('\r\n'): if line == "": raise MimeFormatError("Unexpected end of stream.") else: raise MimeFormatError("Header line too long") line = line[:-2] # strip \r\n if line == "": break # End of headers parts = line.split(':', 1) if len(parts) != 2: raise MimeFormatError("Header did not have a :") name, value = parts name = name.lower() headers.append((name, value)) if name == "content-type": ctype = http_headers.parseContentType(http_headers.tokenize((value,), foldCase=False)) elif name == "content-disposition": fieldname, filename = parseContentDispositionFormData(value) if ctype is None: ctype == http_headers.MimeType('application', 'octet-stream') if fieldname is None: raise MimeFormatError('Content-disposition invalid or omitted.') # End of headers, return (field name, content-type, filename) yield fieldname, filename, ctype return _readHeaders = defer.deferredGenerator(_readHeaders) class _BoundaryWatchingStream(object): def __init__(self, stream, boundary): self.stream = stream self.boundary = boundary self.data = '' self.deferred = defer.Deferred() length = None # unknown def read(self): if self.stream is None: if self.deferred is not None: deferred = self.deferred self.deferred = None deferred.callback(None) return None newdata = self.stream.read() if isinstance(newdata, defer.Deferred): return newdata.addCallbacks(self._gotRead, self._gotError) return self._gotRead(newdata) def _gotRead(self, newdata): if not newdata: raise MimeFormatError("Unexpected EOF") # BLECH, converting buffer back into string. self.data += str(newdata) data = self.data boundary = self.boundary off = data.find(boundary) if off == -1: # No full boundary, check for the first character off = data.rfind(boundary[0], max(0, len(data)-len(boundary))) if off != -1: # We could have a partial boundary, store it for next time self.data = data[off:] return data[:off] else: self.data = '' return data else: self.stream.pushback(data[off+len(boundary):]) self.stream = None return data[:off] def _gotError(self, err): # Propogate error back to MultipartMimeStream also if self.deferred is not None: deferred = self.deferred self.deferred = None deferred.errback(err) return err def close(self): # Assume error will be raised again and handled by MMS? readAndDiscard(self).addErrback(lambda _: None) class MultipartMimeStream(object): implements(IStream) def __init__(self, stream, boundary): self.stream = BufferedStream(stream) self.boundary = "--"+boundary self.first = True def read(self): """ Return a deferred which will fire with a tuple of: (fieldname, filename, ctype, dataStream) or None when all done. Format errors will be sent to the errback. Returns None when all done. IMPORTANT: you *must* exhaust dataStream returned by this call before calling .read() again! 
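The header reader above leans on parseContentDispositionFormData for each part's Content-Disposition header; it returns the field name and, for uploads, the client-supplied filename. For example (the values are illustrative):

from twisted.web2.fileupload import parseContentDispositionFormData

name, filename = parseContentDispositionFormData(
    'form-data; name="photo"; filename="cat.png"')
# name == 'photo', filename == 'cat.png'; for an ordinary form field filename is None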
""" if self.first: self.first = False d = self._readFirstBoundary() else: d = self._readBoundaryLine() d.addCallback(self._doReadHeaders) d.addCallback(self._gotHeaders) return d def _readFirstBoundary(self): #print "_readFirstBoundary" line = self.stream.readline(size=1024) if isinstance(line, defer.Deferred): line = defer.waitForDeferred(line) yield line line = line.getResult() if line != self.boundary + '\r\n': raise MimeFormatError("Extra data before first boundary: %r looking for: %r" % (line, self.boundary + '\r\n')) self.boundary = "\r\n"+self.boundary yield True return _readFirstBoundary = defer.deferredGenerator(_readFirstBoundary) def _readBoundaryLine(self): #print "_readBoundaryLine" line = self.stream.readline(size=1024) if isinstance(line, defer.Deferred): line = defer.waitForDeferred(line) yield line line = line.getResult() if line == "--\r\n": # THE END! yield False return elif line != "\r\n": raise MimeFormatError("Unexpected data on same line as boundary: %r" % (line,)) yield True return _readBoundaryLine = defer.deferredGenerator(_readBoundaryLine) def _doReadHeaders(self, morefields): #print "_doReadHeaders", morefields if not morefields: return None return _readHeaders(self.stream) def _gotHeaders(self, headers): if headers is None: return None bws = _BoundaryWatchingStream(self.stream, self.boundary) self.deferred = bws.deferred ret=list(headers) ret.append(bws) return tuple(ret) def readIntoFile(stream, outFile, maxlen): """Read the stream into a file, but not if it's longer than maxlen. Returns Deferred which will be triggered on finish. """ curlen = [0] def done(_): return _ def write(data): curlen[0] += len(data) if curlen[0] > maxlen: raise MimeFormatError("Maximum length of %d bytes exceeded." % maxlen) outFile.write(data) return readStream(stream, write).addBoth(done) #@defer.deferredGenerator def parseMultipartFormData(stream, boundary, maxMem=100*1024, maxFields=1024, maxSize=10*1024*1024): # If the stream length is known to be too large upfront, abort immediately if stream.length is not None and stream.length > maxSize: raise MimeFormatError("Maximum length of %d bytes exceeded." 
% maxSize) mms = MultipartMimeStream(stream, boundary) numFields = 0 args = {} files = {} while 1: datas = mms.read() if isinstance(datas, defer.Deferred): datas = defer.waitForDeferred(datas) yield datas datas = datas.getResult() if datas is None: break numFields+=1 if numFields == maxFields: raise MimeFormatError("Maximum number of fields %d exceeded"%maxFields) # Parse data fieldname, filename, ctype, stream = datas if filename is None: # Not a file outfile = StringIO() maxBuf = min(maxSize, maxMem) else: outfile = tempfile.NamedTemporaryFile() maxBuf = maxSize x = readIntoFile(stream, outfile, maxBuf) if isinstance(x, defer.Deferred): x = defer.waitForDeferred(x) yield x x = x.getResult() if filename is None: # Is a normal form field outfile.seek(0) data = outfile.read() args.setdefault(fieldname, []).append(data) maxMem -= len(data) maxSize -= len(data) else: # Is a file upload maxSize -= outfile.tell() outfile.seek(0) files.setdefault(fieldname, []).append((filename, ctype, outfile)) yield args, files return parseMultipartFormData = defer.deferredGenerator(parseMultipartFormData) ################################### ##### x-www-urlencoded reader ##### ################################### def parse_urlencoded_stream(input, maxMem=100*1024, keep_blank_values=False, strict_parsing=False): lastdata = '' still_going=1 while still_going: try: yield input.wait data = input.next() except StopIteration: pairs = [lastdata] still_going=0 else: maxMem -= len(data) if maxMem < 0: raise MimeFormatError("Maximum length of %d bytes exceeded." % maxMem) pairs = str(data).split('&') pairs[0] = lastdata + pairs[0] lastdata=pairs.pop() for name_value in pairs: nv = name_value.split('=', 1) if len(nv) != 2: if strict_parsing: raise MimeFormatError("bad query field: %s") % `name_value` continue if len(nv[1]) or keep_blank_values: name = urllib.unquote(nv[0].replace('+', ' ')) value = urllib.unquote(nv[1].replace('+', ' ')) yield name, value parse_urlencoded_stream = generatorToStream(parse_urlencoded_stream) def parse_urlencoded(stream, maxMem=100*1024, maxFields=1024, keep_blank_values=False, strict_parsing=False): d = {} numFields = 0 s=parse_urlencoded_stream(stream, maxMem, keep_blank_values, strict_parsing) while 1: datas = s.read() if isinstance(datas, defer.Deferred): datas = defer.waitForDeferred(datas) yield datas datas = datas.getResult() if datas is None: break name, value = datas numFields += 1 if numFields == maxFields: raise MimeFormatError("Maximum number of fields %d exceeded"%maxFields) if name in d: d[name].append(value) else: d[name] = [value] yield d return parse_urlencoded = defer.deferredGenerator(parse_urlencoded) if __name__ == '__main__': d = parseMultipartFormData( FileStream(open("upload.txt")), "----------0xKhTmLbOuNdArY") from twisted.python import log d.addErrback(log.err) def pr(s): print s d.addCallback(pr) __all__ = ['parseMultipartFormData', 'parse_urlencoded', 'parse_urlencoded_stream', 'MultipartMimeStream', 'MimeFormatError'] TwistedWeb2-8.1.0/twisted/web2/_version.py0000644000175000017500000000021211014046476017053 0ustar dokodoko# This is an auto-generated file. Do not edit it. from twisted.python import versions version = versions.Version('twisted.web2', 8, 1, 0) TwistedWeb2-8.1.0/twisted/web2/resource.py0000644000175000017500000002405210767572311017074 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_server,twisted.web2.test.test_resource -*- # Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. 
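Used directly, outside the request machinery, parseMultipartFormData takes a byte stream plus the boundary string and fires a Deferred with the parsed fields and uploads, much like the self-test at the bottom of this module. A sketch, with a hypothetical upload file and boundary:

from twisted.web2.fileupload import parseMultipartFormData
from twisted.web2.stream import FileStream

d = parseMultipartFormData(FileStream(open("upload.txt")),
                           "----------0xKhTmLbOuNdArY")
def show(result):
    args, files = result
    # args: {fieldname: [string, ...]}
    # files: {fieldname: [(filename, MimeType, temporary file object), ...]}
    print args.keys(), files.keys()
d.addCallback(show)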
""" I hold the lowest-level L{Resource} class and related mix-in classes. """ # System Imports from zope.interface import implements from twisted.web2 import iweb, http, server, responsecode class RenderMixin(object): """ Mix-in class for L{iweb.IResource} which provides a dispatch mechanism for handling HTTP methods. """ def allowedMethods(self): """ @return: A tuple of HTTP methods that are allowed to be invoked on this resource. """ if not hasattr(self, "_allowed_methods"): self._allowed_methods = tuple([name[5:] for name in dir(self) if name.startswith('http_')]) return self._allowed_methods def checkPreconditions(self, request): """ Checks all preconditions imposed by this resource upon a request made against it. @param request: the request to process. @raise http.HTTPError: if any precondition fails. @return: C{None} or a deferred whose callback value is C{request}. """ # # http.checkPreconditions() gets called by the server after every # GET or HEAD request. # # For other methods, we need to know to bail out before request # processing, especially for methods that modify server state (eg. PUT). # We also would like to do so even for methods that don't, if those # methods might be expensive to process. We're assuming that GET and # HEAD are not expensive. # if request.method not in ("GET", "HEAD"): http.checkPreconditions(request) # Check per-method preconditions method = getattr(self, "preconditions_" + request.method, None) if method: return method(request) def renderHTTP(self, request): """ See L{iweb.IResource.renderHTTP}. This implementation will dispatch the given C{request} to another method of C{self} named C{http_}METHOD, where METHOD is the HTTP method used by C{request} (eg. C{http_GET}, C{http_POST}, etc.). Generally, a subclass should implement those methods instead of overriding this one. C{http_*} methods are expected provide the same interface and return the same results as L{iweb.IResource}C{.renderHTTP} (and therefore this method). C{etag} and C{last-modified} are added to the response returned by the C{http_*} header, if known. If an appropriate C{http_*} method is not found, a L{responsecode.NOT_ALLOWED}-status response is returned, with an appropriate C{allow} header. @param request: the request to process. @return: an object adaptable to L{iweb.IResponse}. """ method = getattr(self, "http_" + request.method, None) if not method: response = http.Response(responsecode.NOT_ALLOWED) response.headers.setHeader("allow", self.allowedMethods()) return response d = self.checkPreconditions(request) if d is None: return method(request) else: return d.addCallback(lambda _: method(request)) def http_OPTIONS(self, request): """ Respond to a OPTIONS request. @param request: the request to process. @return: an object adaptable to L{iweb.IResponse}. """ response = http.Response(responsecode.OK) response.headers.setHeader("allow", self.allowedMethods()) return response def http_TRACE(self, request): """ Respond to a TRACE request. @param request: the request to process. @return: an object adaptable to L{iweb.IResponse}. """ return server.doTrace(request) def http_HEAD(self, request): """ Respond to a HEAD request. @param request: the request to process. @return: an object adaptable to L{iweb.IResponse}. """ return self.http_GET(request) def http_GET(self, request): """ Respond to a GET request. This implementation validates that the request body is empty and then dispatches the given C{request} to L{render} and returns its result. @param request: the request to process. 
@return: an object adaptable to L{iweb.IResponse}. """ if request.stream.length != 0: return responsecode.REQUEST_ENTITY_TOO_LARGE return self.render(request) def render(self, request): """ Subclasses should implement this method to do page rendering. See L{http_GET}. @param request: the request to process. @return: an object adaptable to L{iweb.IResponse}. """ raise NotImplementedError("Subclass must implement render method.") class Resource(RenderMixin): """ An L{iweb.IResource} implementation with some convenient mechanisms for locating children. """ implements(iweb.IResource) addSlash = False def locateChild(self, request, segments): """ Locates a child resource of this resource. @param request: the request to process. @param segments: a sequence of URL path segments. @return: a tuple of C{(child, segments)} containing the child of this resource which matches one or more of the given C{segments} in sequence, and a list of remaining segments. """ w = getattr(self, 'child_%s' % (segments[0], ), None) if w: r = iweb.IResource(w, None) if r: return r, segments[1:] return w(request), segments[1:] factory = getattr(self, 'childFactory', None) if factory is not None: r = factory(request, segments[0]) if r: return r, segments[1:] return None, [] def child_(self, request): """ This method locates a child with a trailing C{"/"} in the URL. @param request: the request to process. """ if self.addSlash and len(request.postpath) == 1: return self return None def putChild(self, path, child): """ Register a static child. This implementation registers children by assigning them to attributes with a C{child_} prefix. C{resource.putChild("foo", child)} is therefore same as C{o.child_foo = child}. @param path: the name of the child to register. You almost certainly don't want C{"/"} in C{path}. If you want to add a "directory" resource (e.g. C{/foo/}) specify C{path} as C{""}. @param child: an object adaptable to L{iweb.IResource}. """ setattr(self, 'child_%s' % (path, ), child) def http_GET(self, request): if self.addSlash and request.prepath[-1] != '': # If this is a directory-ish resource... return http.RedirectResponse(request.unparseURL(path=request.path+'/')) return super(Resource, self).http_GET(request) class PostableResource(Resource): """ A L{Resource} capable of handling the POST request method. @cvar maxMem: maximum memory used during the parsing of the data. @type maxMem: C{int} @cvar maxFields: maximum number of form fields allowed. @type maxFields: C{int} @cvar maxSize: maximum size of the whole post allowed. @type maxSize: C{int} """ maxMem = 100 * 1024 maxFields = 1024 maxSize = 10 * 1024 * 1024 def http_POST(self, request): """ Respond to a POST request. Reads and parses the incoming body data then calls L{render}. @param request: the request to process. @return: an object adaptable to L{iweb.IResponse}. """ return server.parsePOSTData(request, self.maxMem, self.maxFields, self.maxSize ).addCallback(lambda res: self.render(request)) class LeafResource(RenderMixin): """ A L{Resource} with no children. """ implements(iweb.IResource) def locateChild(self, request, segments): return self, server.StopTraversal class RedirectResource(LeafResource): """ A L{LeafResource} which always performs a redirect. """ implements(iweb.IResource) def __init__(self, *args, **kwargs): """ Parameters are URL components and are the same as those for L{urlparse.urlunparse}. URL components which are not specified will default to the corresponding component of the URL of the request being redirected. 
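A typical concrete use of the classes above is a small Resource subclass: render produces the response, addSlash handles the trailing-slash form of the URL, and static children can be registered either with putChild or as child_* attributes. A sketch along the lines of the standard web2 hello-world:

from twisted.web2 import resource, http, http_headers

class HelloPage(resource.Resource):
    addSlash = True

    def render(self, request):
        return http.Response(
            200,
            {'content-type': http_headers.MimeType('text', 'html')},
            "<html><body>Hello from web2.</body></html>")

root = HelloPage()
root.putChild('greeting', HelloPage())   # same effect as root.child_greeting = HelloPage()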
""" self._args = args self._kwargs = kwargs def renderHTTP(self, request): return http.RedirectResponse(request.unparseURL(*self._args, **self._kwargs)) class WrapperResource(object): """ An L{iweb.IResource} implementation which wraps a L{RenderMixin} instance and provides a hook in which a subclass can implement logic that is called before request processing on the contained L{Resource}. """ implements(iweb.IResource) def __init__(self, resource): self.resource=resource def hook(self, request): """ Override this method in order to do something before passing control on to the wrapped resource's C{renderHTTP} and C{locateChild} methods. @return: None or a L{Deferred}. If a deferred object is returned, it's value is ignored, but C{renderHTTP} and C{locateChild} are chained onto the deferred as callbacks. """ raise NotImplementedError() def locateChild(self, request, segments): x = self.hook(request) if x is not None: return x.addCallback(lambda data: (self.resource, segments)) return self.resource, segments def renderHTTP(self, request): x = self.hook(request) if x is not None: return x.addCallback(lambda data: self.resource) return self.resource __all__ = ['RenderMixin', 'Resource', 'PostableResource', 'LeafResource', 'WrapperResource'] TwistedWeb2-8.1.0/twisted/web2/tap.py0000644000175000017500000002001110767246617016027 0ustar dokodokoimport os from zope.interface import implements from twisted.python import usage, reflect from twisted.application import internet, service, strports from twisted.scripts.mktap import IServiceMaker from twisted.plugin import IPlugin from twisted.web2 import static, iweb, log, server, channel, vhost class Options(usage.Options): optParameters = [["port", "p", "8080", "Port to start the server on."], ["logfile", "l", None, ("Common Access Logging Format file to write to " "if unspecified access log information will be " "written to the standard twisted log file.")], ["https", None, None, "Port to listen on for Secure HTTP."], ["certificate", "c", "server.pem", "SSL certificate to use for HTTPS."], ["privkey", "k", "server.pem", "SSL certificate to use for HTTPS."]] zsh_actions = {"certificate" : "_files -g '*.pem'", "privkey" : "_files -g '*.pem'"} longdesc = """\ This creates a web2.tap file that can be used by twistd. Basic Examples: To serve a static directory or file: mktap web2 --path=/tmp/ To serve a dynamic resource: mktap web2 --class=fully.qualified.ClassName To serve a directory of the form: /var/www/domain1/ /var/www/domain2/ mktap web2 --vhost-path=/var/www/ All the above options are incompatible as they all specify the root resource. However you can use the following options in conjunction with --vhost-path To serve a specific host name as a static file: mktap web2 --vhost-static=domain3=/some/other/root/domain3 Or to serve a specific host name as a dynamic resource: mktap web2 --vhost-class=domain4=fully.qualified.ClassName """ def __init__(self): usage.Options.__init__(self) self['indexes'] = [] self['root'] = None def opt_index(self, indexName): """Add the name of a file used to check for directory indexes. [default: index, index.html] """ self['indexes'].append(indexName) opt_i = opt_index def opt_path(self, path): """A path that will be used to serve the root resource as a raw file or directory. """ if self['root']: raise usage.UsageError("You may only have one root resource.") self['root'] = static.File(os.path.abspath(path)) def opt_processor(self, proc): """`ext=class' where `class' is added as a Processor for files ending with `ext'. 
""" if not isinstance(self['root'], static.File): raise usage.UsageError("You can only use --processor after --path.") ext, klass = proc.split('=', 1) self['root'].processors[ext] = reflect.namedClass(klass) def opt_class(self, className): """A class that will be used to serve the root resource. Must implement twisted.web2.iweb.IResource and take no arguments. """ if self['root']: raise usage.UsageError("You may only have one root resource.") classObj = reflect.namedClass(className) self['root'] = iweb.IResource(classObj()) def opt_allow_ignore_ext(self): """Specify whether or not a request for 'foo' should return 'foo.ext'""" if not isinstance(self['root'], static.File): raise usage.UsageError("You can only use --allow_ignore_ext " "after --path.") self['root'].ignoreExt('*') def opt_ignore_ext(self, ext): """Specify an extension to ignore. These will be processed in order. """ if not isinstance(self['root'], static.File): raise usage.UsageError("You can only use --ignore_ext " "after --path.") self['root'].ignoreExt(ext) def opt_mimetype(self, mimetype): """Mapping from file extension to MIME Type in the form of 'ext=type'. Example: html=text/html """ if not isinstance(self['root'], static.File): raise usage.UsageError("You can only use --mimetype " "after --path.") ext, mimetype = mimetype.split('=', 1) # this is really gross, there should be a public api for this. self['root']._sharedContentTypes.update({ext: mimetype}) def opt_vhost_path(self, path): """Specify a directory to use for automatic named virtual hosts. It is assumed that this directory contains a series of subdirectories each representing a virtual host domain name and containing the files to be served at that domain. """ if self['root']: if not isintance(self['root'], vhost.NameVirtualHost): raise usage.UsageError("You may only have one root resource") else: self['root'] = vhost.NameVirtualHost() path = os.path.abspath(path) for name in os.listdir(path): fullname = os.path.join(path, name) self['root'].addHost(name, static.File(fullname)) def opt_vhost_static(self, virtualHost): """Specify a virtual host in the form of domain=path to be served as raw directory or file. """ if (self['root'] and not \ isinstance(self['root'], vhost.NameVirtualHost)): raise usage.UsageError("You can only use --vhost-static alone " "or with --vhost-class and --vhost-path") domain, path = virtualHost.split('=', 1) if not self['root']: self['root'] = vhost.NameVirtualHost() self['root'].addHost(domain, static.File(os.path.abspath(path))) def opt_vhost_class(self, virtualHost): """Specify a virtual host in the form of domain=class, where class can be adapted to an iweb.IResource and has a zero-argument constructor. 
""" if (self['root'] and not \ isinstance(self['root'], vhost.NameVirtualHost)): raise usage.UsageError("You can not use --vhost-class with " "--path or --class.") domain, className = virtualHost.split('=', 1) if not self['root']: self['root'] = vhost.NameVirtualHost() classObj = reflect.namedClass(className) self['root'].addHost(domain, iweb.IResource(classObj())) def postOptions(self): if self['https']: try: from twisted.internet.ssl import DefaultOpenSSLContextFactory except ImportError: raise usage.UsageError("SSL support not installed") class Web2Service(service.MultiService): def __init__(self, logObserver): self.logObserver = logObserver service.MultiService.__init__(self) def startService(self): service.MultiService.startService(self) self.logObserver.start() def stopService(self): service.MultiService.stopService(self) self.logObserver.stop() def makeService(config): if config['logfile']: logObserver = log.FileAccessLoggingObserver(config['logfile']) else: logObserver = log.DefaultCommonAccessLoggingObserver() if config['root']: if config['indexes']: config['root'].indexNames = config['indexes'] root = log.LogWrapperResource(config['root']) s = Web2Service(logObserver) site = server.Site(root) chan = channel.HTTPFactory(site) if config['https']: from twisted.internet.ssl import DefaultOpenSSLContextFactory i = internet.SSLServer(int(config['https']), chan, DefaultOpenSSLContextFactory(config['privkey'], config['certificate'])) i.setServiceParent(s) strports.service(config['port'], chan ).setServiceParent(s) return s TwistedWeb2-8.1.0/twisted/web2/iweb.py0000644000175000017500000003212010526121003016143 0ustar dokodoko# -*- test-case-name: twisted.web2.test -*- """ I contain the interfaces for several web related objects including IRequest and IResource. I am based heavily on ideas from nevow.inevow """ from zope.interface import Attribute, Interface, interface # server.py interfaces class IResource(Interface): """ An HTTP resource. I serve 2 main purposes: one is to provide a standard representation for what HTTP specification calls an 'entity', and the other is to provide an mechanism for mapping URLs to content. """ def locateChild(req, segments): """Locate another object which can be adapted to IResource. @return: A 2-tuple of (resource, remaining-path-segments), or a deferred which will fire the above. Causes the object publishing machinery to continue on with specified resource and segments, calling the appropriate method on the specified resource. If you return (self, L{server.StopTraversal}), this instructs web2 to immediately stop the lookup stage, and switch to the rendering stage, leaving the remaining path alone for your render function to handle. """ def renderHTTP(req): """Return an IResponse or a deferred which will fire an IResponse. This response will be written to the web browser which initiated the request. """ # Is there a better way to do this than this funky extra class? _default = object() class SpecialAdaptInterfaceClass(interface.InterfaceClass): # A special adapter for IResource to handle the extra step of adapting # from IOldNevowResource-providing resources. 
def __call__(self, other, alternate=_default): result = super(SpecialAdaptInterfaceClass, self).__call__(other, alternate) if result is not alternate: return result result = IOldNevowResource(other, alternate) if result is not alternate: result = IResource(result) return result if alternate is not _default: return alternate raise TypeError('Could not adapt', other, self) IResource.__class__ = SpecialAdaptInterfaceClass class IOldNevowResource(Interface): # Shared interface with inevow.IResource """ I am a web resource. """ def locateChild(ctx, segments): """Locate another object which can be adapted to IResource Return a tuple of resource, path segments """ def renderHTTP(ctx): """Return a string or a deferred which will fire a string. This string will be written to the web browser which initiated this request. Unlike iweb.IResource, this expects the incoming data to have already been read and parsed into request.args and request.content, and expects to return a string instead of a response object. """ class ICanHandleException(Interface): # Shared interface with inevow.ICanHandleException def renderHTTP_exception(request, failure): """Render an exception to the given request object. """ def renderInlineException(request, reason): """Return stan representing the exception, to be printed in the page, not replacing the page.""" # http.py interfaces class IResponse(Interface): """I'm a response.""" code = Attribute("The HTTP response code") headers = Attribute("A http_headers.Headers instance of headers to send") stream = Attribute("A stream.IByteStream of outgoing data, or else None.") class IRequest(Interface): """I'm a request for a web resource """ method = Attribute("The HTTP method from the request line, e.g. GET") uri = Attribute("The raw URI from the request line. May or may not include host.") clientproto = Attribute("Protocol from the request line, e.g. HTTP/1.1") headers = Attribute("A http_headers.Headers instance of incoming headers.") stream = Attribute("A stream.IByteStream of incoming data.") def writeResponse(response): """Write an IResponse object to the client""" chanRequest = Attribute("The ChannelRequest. I wonder if this is public really?") class IOldRequest(Interface): # Shared interface with inevow.ICurrentSegments """An old HTTP request. Subclasses should override the process() method to determine how the request will be processed. @ivar method: The HTTP method that was used. @ivar uri: The full URI that was requested (includes arguments). @ivar path: The path only (arguments not included). @ivar args: All of the arguments, including URL and POST arguments. @type args: A mapping of strings (the argument names) to lists of values. i.e., ?foo=bar&foo=baz&quux=spam results in {'foo': ['bar', 'baz'], 'quux': ['spam']}. @ivar received_headers: All received headers """ # Methods for received request def getHeader(key): """Get a header that was sent from the network. """ def getCookie(key): """Get a cookie that was sent from the network. """ def getAllHeaders(): """Return dictionary of all headers the request received.""" def getRequestHostname(): """Get the hostname that the user passed in to the request. This will either use the Host: header (if it is available) or the host we are listening on if the header is unavailable. """ def getHost(): """Get my originally requesting transport's host. Don't rely on the 'transport' attribute, since Request objects may be copied remotely. For information on this method's return value, see twisted.internet.tcp.Port. 
""" def getClientIP(): pass def getClient(): pass def getUser(): pass def getPassword(): pass def isSecure(): pass def getSession(sessionInterface = None): pass def URLPath(): pass def prePathURL(): pass def rememberRootURL(): """ Remember the currently-processed part of the URL for later recalling. """ def getRootURL(): """ Get a previously-remembered URL. """ # Methods for outgoing request def finish(): """We are finished writing data.""" def write(data): """ Write some data as a result of an HTTP request. The first time this is called, it writes out response data. """ def addCookie(k, v, expires=None, domain=None, path=None, max_age=None, comment=None, secure=None): """Set an outgoing HTTP cookie. In general, you should consider using sessions instead of cookies, see twisted.web.server.Request.getSession and the twisted.web.server.Session class for details. """ def setResponseCode(code, message=None): """Set the HTTP response code. """ def setHeader(k, v): """Set an outgoing HTTP header. """ def redirect(url): """Utility function that does a redirect. The request should have finish() called after this. """ def setLastModified(when): """Set the X{Last-Modified} time for the response to this request. If I am called more than once, I ignore attempts to set Last-Modified earlier, only replacing the Last-Modified time if it is to a later value. If I am a conditional request, I may modify my response code to L{NOT_MODIFIED} if appropriate for the time given. @param when: The last time the resource being returned was modified, in seconds since the epoch. @type when: number @return: If I am a X{If-Modified-Since} conditional request and the time given is not newer than the condition, I return L{http.CACHED} to indicate that you should write no body. Otherwise, I return a false value. """ def setETag(etag): """Set an X{entity tag} for the outgoing response. That's \"entity tag\" as in the HTTP/1.1 X{ETag} header, \"used for comparing two or more entities from the same requested resource.\" If I am a conditional request, I may modify my response code to L{NOT_MODIFIED} or L{PRECONDITION_FAILED}, if appropriate for the tag given. @param etag: The entity tag for the resource being returned. @type etag: string @return: If I am a X{If-None-Match} conditional request and the tag matches one in the request, I return L{http.CACHED} to indicate that you should write no body. Otherwise, I return a false value. """ def setHost(host, port, ssl=0): """Change the host and port the request thinks it's using. This method is useful for working with reverse HTTP proxies (e.g. both Squid and Apache's mod_proxy can do this), when the address the HTTP client is using is different than the one we're listening on. For example, Apache may be listening on https://www.example.com, and then forwarding requests to http://localhost:8080, but we don't want HTML produced by Twisted to say 'http://localhost:8080', they should say 'https://www.example.com', so we do:: request.setHost('www.example.com', 443, ssl=1) This method is experimental. """ class IChanRequestCallbacks(Interface): """The bits that are required of a Request for interfacing with a IChanRequest object""" def __init__(chanRequest, command, path, version, contentLength, inHeaders): """Create a new Request object. @param chanRequest: the IChanRequest object creating this request @param command: the HTTP command e.g. GET @param path: the HTTP path e.g. /foo/bar.html @param version: the parsed HTTP version e.g. 
(1,1) @param contentLength: how much data to expect, or None if unknown @param inHeaders: the request headers""" def process(): """Process the request. Called as soon as it's possibly reasonable to return a response. handleContentComplete may or may not have been called already.""" def handleContentChunk(data): """Called when a piece of incoming data has been received.""" def handleContentComplete(): """Called when the incoming data stream is finished.""" def connectionLost(reason): """Called if the connection was lost.""" class IChanRequest(Interface): def writeIntermediateResponse(code, headers=None): """Write a non-terminating response. Intermediate responses cannot contain data. If the channel does not support intermediate responses, do nothing. @ivar code: The response code. Should be in the 1xx range. @type code: int @ivar headers: the headers to send in the response @type headers: C{twisted.web.http_headers.Headers} """ pass def writeHeaders(code, headers): """Write a final response. @param code: The response code. Should not be in the 1xx range. @type code: int @param headers: the headers to send in the response. They will be augmented with any connection-oriented headers as necessary for the protocol. @type headers: C{twisted.web.http_headers.Headers} """ pass def write(data): """Write some data. @param data: the data bytes @type data: str """ pass def finish(): """Finish the request, and clean up the connection if necessary. """ pass def abortConnection(): """Forcibly abort the connection without cleanly closing. Use if, for example, you can't write all the data you promised. """ pass def registerProducer(producer, streaming): """Register a producer with the standard API.""" pass def unregisterProducer(): """Unregister a producer.""" pass def getHostInfo(): """Returns a tuple of (address, socket user connected to, boolean, was it secure). Note that this should not necsessarily always return the actual local socket information from twisted. E.g. in a CGI, it should use the variables coming from the invoking script. """ def getRemoteHost(): """Returns an address of the remote host. Like getHostInfo, this information may come from the real socket, or may come from additional information, depending on the transport. """ persistent = Attribute("""Whether this request supports HTTP connection persistence. May be set to False. Should not be set to other values.""") class ISite(Interface): pass __all__ = ['ICanHandleException', 'IChanRequest', 'IChanRequestCallbacks', 'IOldNevowResource', 'IOldRequest', 'IRequest', 'IResource', 'IResponse', 'ISite'] TwistedWeb2-8.1.0/twisted/web2/client/0000755000175000017500000000000011014056216016131 5ustar dokodokoTwistedWeb2-8.1.0/twisted/web2/client/__init__.py0000644000175000017500000000006310376661753020263 0ustar dokodoko""" Twisted.web2.client: Client Implementation """ TwistedWeb2-8.1.0/twisted/web2/client/http.py0000644000175000017500000002631010706063764017501 0ustar dokodoko# -*- test-case-name: twisted.web2.test.test_client -*- # Copyright (c) 2001-2007 Twisted Matrix Laboratories. # See LICENSE for details. """ Client-side HTTP implementation. 
""" from zope.interface import implements from twisted.internet.defer import Deferred from twisted.protocols.basic import LineReceiver from twisted.protocols.policies import TimeoutMixin from twisted.web2.responsecode import BAD_REQUEST, HTTP_VERSION_NOT_SUPPORTED from twisted.web2.http import parseVersion, Response from twisted.web2.http_headers import Headers from twisted.web2.stream import ProducerStream, StreamProducer, IByteStream from twisted.web2.channel.http import HTTPParser, PERSIST_NO_PIPELINE, PERSIST_PIPELINE from twisted.web2.client.interfaces import IHTTPClientManager class ProtocolError(Exception): """ Exception raised when a HTTP error happened. """ class ClientRequest(object): """ A class for describing an HTTP request to be sent to the server. """ def __init__(self, method, uri, headers, stream): """ @param method: The HTTP method to for this request, ex: 'GET', 'HEAD', 'POST', etc. @type method: C{str} @param uri: The URI of the resource to request, this may be absolute or relative, however the interpretation of this URI is left up to the remote server. @type uri: C{str} @param headers: Headers to be sent to the server. It is important to note that this object does not create any implicit headers. So it is up to the HTTP Client to add required headers such as 'Host'. @type headers: C{dict}, L{twisted.web2.http_headers.Headers}, or C{None} @param stream: Content body to send to the remote HTTP server. @type stream: L{twisted.web2.stream.IByteStream} """ self.method = method self.uri = uri if isinstance(headers, Headers): self.headers = headers else: self.headers = Headers(headers or {}) if stream is not None: self.stream = IByteStream(stream) else: self.stream = None class HTTPClientChannelRequest(HTTPParser): parseCloseAsEnd = True outgoing_version = "HTTP/1.1" chunkedOut = False finished = False closeAfter = False def __init__(self, channel, request, closeAfter): HTTPParser.__init__(self, channel) self.request = request self.closeAfter = closeAfter self.transport = self.channel.transport self.responseDefer = Deferred() def submit(self): l = [] request = self.request if request.method == "HEAD": # No incoming data will arrive. self.length = 0 l.append('%s %s %s\r\n' % (request.method, request.uri, self.outgoing_version)) if request.headers is not None: for name, valuelist in request.headers.getAllRawHeaders(): for value in valuelist: l.append("%s: %s\r\n" % (name, value)) if request.stream is not None: if request.stream.length is not None: l.append("%s: %s\r\n" % ('Content-Length', request.stream.length)) else: # Got a stream with no length. Send as chunked and hope, against # the odds, that the server actually supports chunked uploads. l.append("%s: %s\r\n" % ('Transfer-Encoding', 'chunked')) self.chunkedOut = True if self.closeAfter: l.append("%s: %s\r\n" % ('Connection', 'close')) else: l.append("%s: %s\r\n" % ('Connection', 'Keep-Alive')) l.append("\r\n") self.transport.writeSequence(l) d = StreamProducer(request.stream).beginProducing(self) d.addCallback(self._finish).addErrback(self._error) def registerProducer(self, producer, streaming): """ Register a producer. """ self.transport.registerProducer(producer, streaming) def unregisterProducer(self): self.transport.unregisterProducer() def write(self, data): if not data: return elif self.chunkedOut: self.transport.writeSequence(("%X\r\n" % len(data), data, "\r\n")) else: self.transport.write(data) def _finish(self, x): """ We are finished writing data. 
""" if self.chunkedOut: # write last chunk and closing CRLF self.transport.write("0\r\n\r\n") self.finished = True self.channel.requestWriteFinished(self) del self.transport def _error(self, err): """ Abort parsing, and depending of the status of the request, either fire the C{responseDefer} if no response has been sent yet, or close the stream. """ self.abortParse() if hasattr(self, 'stream') and self.stream is not None: self.stream.finish(err) else: self.responseDefer.errback(err) def _abortWithError(self, errcode, text): """ Abort parsing by forwarding a C{ProtocolError} to C{_error}. """ self._error(ProtocolError(text)) def connectionLost(self, reason): self._error(reason) def gotInitialLine(self, initialLine): parts = initialLine.split(' ', 2) # Parse the initial request line if len(parts) != 3: self._abortWithError(BAD_REQUEST, "Bad response line: %s" % (initialLine,)) return strversion, self.code, message = parts try: protovers = parseVersion(strversion) if protovers[0] != 'http': raise ValueError() except ValueError: self._abortWithError(BAD_REQUEST, "Unknown protocol: %s" % (strversion,)) return self.version = protovers[1:3] # Ensure HTTP 0 or HTTP 1. if self.version[0] != 1: self._abortWithError(HTTP_VERSION_NOT_SUPPORTED, 'Only HTTP 1.x is supported.') return ## FIXME: Actually creates Response, function is badly named! def createRequest(self): self.stream = ProducerStream(self.length) self.response = Response(self.code, self.inHeaders, self.stream) self.stream.registerProducer(self, True) del self.inHeaders ## FIXME: Actually processes Response, function is badly named! def processRequest(self): self.responseDefer.callback(self.response) def handleContentChunk(self, data): self.stream.write(data) def handleContentComplete(self): self.stream.finish() class EmptyHTTPClientManager(object): """ A dummy HTTPClientManager. It doesn't do any client management, and is meant to be used only when creating an HTTPClientProtocol directly. """ implements(IHTTPClientManager) def clientBusy(self, proto): pass def clientIdle(self, proto): pass def clientPipelining(self, proto): pass def clientGone(self, proto): pass class HTTPClientProtocol(LineReceiver, TimeoutMixin, object): """ A HTTP 1.1 Client with request pipelining support. """ chanRequest = None maxHeaderLength = 10240 firstLine = 1 readPersistent = PERSIST_NO_PIPELINE # inputTimeOut should be pending whenever a complete request has # been written but the complete response has not yet been # received, and be reset every time data is received. inputTimeOut = 60 * 4 def __init__(self, manager=None): """ @param manager: The object this client reports it state to. @type manager: L{IHTTPClientManager} """ self.outRequest = None self.inRequests = [] if manager is None: manager = EmptyHTTPClientManager() self.manager = manager def lineReceived(self, line): if not self.inRequests: # server sending random unrequested data. self.transport.loseConnection() return # If not currently writing this request, set timeout if self.inRequests[0] is not self.outRequest: self.setTimeout(self.inputTimeOut) if self.firstLine: self.firstLine = 0 self.inRequests[0].gotInitialLine(line) else: self.inRequests[0].lineReceived(line) def rawDataReceived(self, data): if not self.inRequests: # Server sending random unrequested data. 
self.transport.loseConnection() return # If not currently writing this request, set timeout if self.inRequests[0] is not self.outRequest: self.setTimeout(self.inputTimeOut) self.inRequests[0].rawDataReceived(data) def submitRequest(self, request, closeAfter=True): """ @param request: The request to send to a remote server. @type request: L{ClientRequest} @param closeAfter: If True the 'Connection: close' header will be sent, otherwise 'Connection: keep-alive' @type closeAfter: C{bool} @rtype: L{twisted.internet.defer.Deferred} @return: A Deferred which will be called back with the L{twisted.web2.http.Response} from the server. """ # Assert we're in a valid state to submit more assert self.outRequest is None assert ((self.readPersistent is PERSIST_NO_PIPELINE and not self.inRequests) or self.readPersistent is PERSIST_PIPELINE) self.manager.clientBusy(self) if closeAfter: self.readPersistent = False self.outRequest = chanRequest = HTTPClientChannelRequest(self, request, closeAfter) self.inRequests.append(chanRequest) chanRequest.submit() return chanRequest.responseDefer def requestWriteFinished(self, request): assert request is self.outRequest self.outRequest = None # Tell the manager if more requests can be submitted. self.setTimeout(self.inputTimeOut) if self.readPersistent is PERSIST_PIPELINE: self.manager.clientPipelining(self) def requestReadFinished(self, request): assert self.inRequests[0] is request del self.inRequests[0] self.firstLine = True if not self.inRequests: if self.readPersistent: self.setTimeout(None) self.manager.clientIdle(self) else: self.transport.loseConnection() def setReadPersistent(self, persist): self.readPersistent = persist if not persist: # Tell all requests but first to abort. for request in self.inRequests[1:]: request.connectionLost(None) del self.inRequests[1:] def connectionLost(self, reason): self.readPersistent = False self.setTimeout(None) self.manager.clientGone(self) # Tell all requests to abort. for request in self.inRequests: if request is not None: request.connectionLost(reason) TwistedWeb2-8.1.0/twisted/web2/client/interfaces.py0000644000175000017500000000243210624150303020625 0ustar dokodokofrom zope.interface import Interface class IHTTPClientManager(Interface): """I coordinate between multiple L{HTTPClientProtocol} objects connected to a single server to facilite request queuing and pipelining. """ def clientBusy(proto): """Called when the L{HTTPClientProtocol} doesn't want to accept anymore requests. @param proto: The L{HTTPClientProtocol} that is changing state. @type proto: L{HTTPClientProtocol} """ pass def clientIdle(proto): """Called when an L{HTTPClientProtocol} is able to accept more requests. @param proto: The L{HTTPClientProtocol} that is changing state. @type proto: L{HTTPClientProtocol} """ pass def clientPipelining(proto): """Called when the L{HTTPClientProtocol} determines that it is able to support request pipelining. @param proto: The L{HTTPClientProtocol} that is changing state. @type proto: L{HTTPClientProtocol} """ pass def clientGone(proto): """Called when the L{HTTPClientProtocol} disconnects from the server. @param proto: The L{HTTPClientProtocol} that is changing state. @type proto: L{HTTPClientProtocol} """ pass TwistedWeb2-8.1.0/twisted/web2/static.py0000644000175000017500000004550110634573544016540 0ustar dokodoko# Copyright (c) 2001-2004 Twisted Matrix Laboratories. # See LICENSE for details. """ I deal with static resources. 
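A minimal sketch of serving a directory tree, mirroring the test code at the
bottom of this module (the port and filesystem path are placeholders):

    from twisted.application import service, strports
    from twisted.web2 import server, static

    root = static.File('/srv/www')
    application = service.Application('static-demo')
    s = strports.service('8080', server.Site(root))
    s.setServiceParent(application)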
""" # System Imports import os, time, stat import tempfile import md5 # Sibling Imports from twisted.web2 import http_headers, resource from twisted.web2 import http, iweb, stream, responsecode, server, dirlist # Twisted Imports from twisted.python import filepath from twisted.internet.defer import maybeDeferred from zope.interface import implements class MetaDataMixin(object): """ Mix-in class for L{iweb.IResource} which provides methods for accessing resource metadata specified by HTTP. """ def etag(self): """ @return: The current etag for the resource if available, None otherwise. """ return None def lastModified(self): """ @return: The last modified time of the resource if available, None otherwise. """ return None def creationDate(self): """ @return: The creation date of the resource if available, None otherwise. """ return None def contentLength(self): """ @return: The size in bytes of the resource if available, None otherwise. """ return None def contentType(self): """ @return: The MIME type of the resource if available, None otherwise. """ return None def contentEncoding(self): """ @return: The encoding of the resource if available, None otherwise. """ return None def displayName(self): """ @return: The display name of the resource if available, None otherwise. """ return None def exists(self): """ @return: True if the resource exists on the server, False otherwise. """ return True class StaticRenderMixin(resource.RenderMixin, MetaDataMixin): def checkPreconditions(self, request): # This code replaces the code in resource.RenderMixin if request.method not in ("GET", "HEAD"): http.checkPreconditions( request, entityExists = self.exists(), etag = self.etag(), lastModified = self.lastModified(), ) # Check per-method preconditions method = getattr(self, "preconditions_" + request.method, None) if method: return method(request) def renderHTTP(self, request): """ See L{resource.RenderMixIn.renderHTTP}. This implementation automatically sets some headers on the response based on data available from L{MetaDataMixin} methods. """ def setHeaders(response): response = iweb.IResponse(response) # Don't provide additional resource information to error responses if response.code < 400: # Content-* headers refer to the response content, not # (necessarily) to the resource content, so they depend on the # request method, and therefore can't be set here. for (header, value) in ( ("etag", self.etag()), ("last-modified", self.lastModified()), ): if value is not None: response.headers.setHeader(header, value) return response def onError(f): # If we get an HTTPError, run its response through setHeaders() as # well. f.trap(http.HTTPError) return setHeaders(f.value.response) d = maybeDeferred(super(StaticRenderMixin, self).renderHTTP, request) return d.addCallbacks(setHeaders, onError) class Data(resource.Resource): """ This is a static, in-memory resource. 
""" def __init__(self, data, type): self.data = data self.type = http_headers.MimeType.fromString(type) self.created_time = time.time() def etag(self): lastModified = self.lastModified() return http_headers.ETag("%X-%X" % (lastModified, hash(self.data)), weak=(time.time() - lastModified <= 1)) def lastModified(self): return self.creationDate() def creationDate(self): return self.created_time def contentLength(self): return len(self.data) def contentType(self): return self.type def render(self, req): return http.Response( responsecode.OK, http_headers.Headers({'content-type': self.contentType()}), stream=self.data) class File(StaticRenderMixin): """ File is a resource that represents a plain non-interpreted file (although it can look for an extension like .rpy or .cgi and hand the file to a processor for interpretation if you wish). Its constructor takes a file path. Alternatively, you can give a directory path to the constructor. In this case the resource will represent that directory, and its children will be files underneath that directory. This provides access to an entire filesystem tree with a single Resource. If you map the URL 'http://server/FILE' to a resource created as File('/tmp'), then http://server/FILE/ will return an HTML-formatted listing of the /tmp/ directory, and http://server/FILE/foo/bar.html will return the contents of /tmp/foo/bar.html . """ implements(iweb.IResource) def _getContentTypes(self): if not hasattr(File, "_sharedContentTypes"): File._sharedContentTypes = loadMimeTypes() return File._sharedContentTypes contentTypes = property(_getContentTypes) contentEncodings = { ".gz" : "gzip", ".bz2": "bzip2" } processors = {} indexNames = ["index", "index.html", "index.htm", "index.trp", "index.rpy"] type = None def __init__(self, path, defaultType="text/plain", ignoredExts=(), processors=None, indexNames=None): """Create a file with the given path. """ super(File, self).__init__() self.putChildren = {} self.fp = filepath.FilePath(path) # Remove the dots from the path to split self.defaultType = defaultType self.ignoredExts = list(ignoredExts) if processors is not None: self.processors = dict([ (key.lower(), value) for key, value in processors.items() ]) if indexNames is not None: self.indexNames = indexNames def exists(self): return self.fp.exists() def etag(self): if not self.fp.exists(): return None st = self.fp.statinfo # # Mark ETag as weak if it was modified more recently than we can # measure and report, as it could be modified again in that span # and we then wouldn't know to provide a new ETag. # weak = (time.time() - st.st_mtime <= 1) return http_headers.ETag( "%X-%X-%X" % (st.st_ino, st.st_size, st.st_mtime), weak=weak ) def lastModified(self): if self.fp.exists(): return self.fp.getmtime() else: return None def creationDate(self): if self.fp.exists(): return self.fp.getmtime() else: return None def contentLength(self): if self.fp.exists(): if self.fp.isfile(): return self.fp.getsize() else: # Computing this would require rendering the resource; let's # punt instead. 
return None else: return None def _initTypeAndEncoding(self): self._type, self._encoding = getTypeAndEncoding( self.fp.basename(), self.contentTypes, self.contentEncodings, self.defaultType ) # Handle cases not covered by getTypeAndEncoding() if self.fp.isdir(): self._type = "httpd/unix-directory" def contentType(self): if not hasattr(self, "_type"): self._initTypeAndEncoding() return http_headers.MimeType.fromString(self._type) def contentEncoding(self): if not hasattr(self, "_encoding"): self._initTypeAndEncoding() return self._encoding def displayName(self): if self.fp.exists(): return self.fp.basename() else: return None def ignoreExt(self, ext): """Ignore the given extension. Serve file.ext if file is requested """ self.ignoredExts.append(ext) def directoryListing(self): return dirlist.DirectoryLister(self.fp.path, self.listChildren(), self.contentTypes, self.contentEncodings, self.defaultType) def putChild(self, name, child): """ Register a child with the given name with this resource. @param name: the name of the child (a URI path segment) @param child: the child to register """ self.putChildren[name] = child def getChild(self, name): """ Look up a child resource. @return: the child of this resource with the given name. """ if name == "": return self child = self.putChildren.get(name, None) if child: return child child_fp = self.fp.child(name) if child_fp.exists(): return self.createSimilarFile(child_fp.path) else: return None def listChildren(self): """ @return: a sequence of the names of all known children of this resource. """ children = self.putChildren.keys() if self.fp.isdir(): children += [c for c in self.fp.listdir() if c not in children] return children def locateChild(self, req, segments): """ See L{IResource}C{.locateChild}. """ # If getChild() finds a child resource, return it child = self.getChild(segments[0]) if child is not None: return (child, segments[1:]) # If we're not backed by a directory, we have no children. # But check for existance first; we might be a collection resource # that the request wants created. self.fp.restat(False) if self.fp.exists() and not self.fp.isdir(): return (None, ()) # OK, we need to return a child corresponding to the first segment path = segments[0] if path: fpath = self.fp.child(path) else: # Request is for a directory (collection) resource return (self, server.StopTraversal) # Don't run processors on directories - if someone wants their own # customized directory rendering, subclass File instead. 
if fpath.isfile(): processor = self.processors.get(fpath.splitext()[1].lower()) if processor: return ( processor(fpath.path), segments[1:]) elif not fpath.exists(): sibling_fpath = fpath.siblingExtensionSearch(*self.ignoredExts) if sibling_fpath is not None: fpath = sibling_fpath return self.createSimilarFile(fpath.path), segments[1:] def renderHTTP(self, req): self.fp.restat(False) return super(File, self).renderHTTP(req) def render(self, req): """You know what you doing.""" if not self.fp.exists(): return responsecode.NOT_FOUND if self.fp.isdir(): if req.uri[-1] != "/": # Redirect to include trailing '/' in URI return http.RedirectResponse(req.unparseURL(path=req.path+'/')) else: ifp = self.fp.childSearchPreauth(*self.indexNames) if ifp: # Render from the index file standin = self.createSimilarFile(ifp.path) else: # Render from a DirectoryLister standin = dirlist.DirectoryLister( self.fp.path, self.listChildren(), self.contentTypes, self.contentEncodings, self.defaultType ) return standin.render(req) try: f = self.fp.open() except IOError, e: import errno if e[0] == errno.EACCES: return responsecode.FORBIDDEN elif e[0] == errno.ENOENT: return responsecode.NOT_FOUND else: raise response = http.Response() response.stream = stream.FileStream(f, 0, self.fp.getsize()) for (header, value) in ( ("content-type", self.contentType()), ("content-encoding", self.contentEncoding()), ): if value is not None: response.headers.setHeader(header, value) return response def createSimilarFile(self, path): return self.__class__(path, self.defaultType, self.ignoredExts, self.processors, self.indexNames[:]) class FileSaver(resource.PostableResource): allowedTypes = (http_headers.MimeType('text', 'plain'), http_headers.MimeType('text', 'html'), http_headers.MimeType('text', 'css')) def __init__(self, destination, expectedFields=[], allowedTypes=None, maxBytes=1000000, permissions=0644): self.destination = destination self.allowedTypes = allowedTypes or self.allowedTypes self.maxBytes = maxBytes self.expectedFields = expectedFields self.permissions = permissions def makeUniqueName(self, filename): """Called when a unique filename is needed. filename is the name of the file as given by the client. Returns the fully qualified path of the file to create. The file must not yet exist. """ return tempfile.mktemp(suffix=os.path.splitext(filename)[1], dir=self.destination) def isSafeToWrite(self, filename, mimetype, filestream): """Returns True if it's "safe" to write this file, otherwise it raises an exception. """ if filestream.length > self.maxBytes: raise IOError("%s: File exceeds maximum length (%d > %d)" % (filename, filestream.length, self.maxBytes)) if mimetype not in self.allowedTypes: raise IOError("%s: File type not allowed %s" % (filename, mimetype)) return True def writeFile(self, filename, mimetype, fileobject): """Does the I/O dirty work after it calls isSafeToWrite to make sure it's safe to write this file. """ filestream = stream.FileStream(fileobject) if self.isSafeToWrite(filename, mimetype, filestream): outname = self.makeUniqueName(filename) flags = os.O_WRONLY | os.O_CREAT | os.O_EXCL | getattr(os, "O_BINARY", 0) fileobject = os.fdopen(os.open(outname, flags, self.permissions), 'wb', 0) stream.readIntoFile(filestream, fileobject) return outname def render(self, req): content = [""] if req.files: for fieldName in req.files: if fieldName in self.expectedFields: for finfo in req.files[fieldName]: try: outname = self.writeFile(*finfo) content.append("Saved file %s
" % outname) except IOError, err: content.append(str(err) + "
") else: content.append("%s is not a valid field" % fieldName) else: content.append("No files given") content.append("") return http.Response(responsecode.OK, {}, stream='\n'.join(content)) # FIXME: hi there I am a broken class # """I contain AsIsProcessor, which serves files 'As Is' # Inspired by Apache's mod_asis # """ # # class ASISProcessor: # implements(iweb.IResource) # # def __init__(self, path): # self.path = path # # def renderHTTP(self, request): # request.startedWriting = 1 # return File(self.path) # # def locateChild(self, request): # return None, () ## # Utilities ## dangerousPathError = http.HTTPError(responsecode.NOT_FOUND) #"Invalid request URL." def isDangerous(path): return path == '..' or '/' in path or os.sep in path def addSlash(request): return "http%s://%s%s/" % ( request.isSecure() and 's' or '', request.getHeader("host"), (request.uri.split('?')[0])) def loadMimeTypes(mimetype_locations=['/etc/mime.types']): """ Multiple file locations containing mime-types can be passed as a list. The files will be sourced in that order, overriding mime-types from the files sourced beforehand, but only if a new entry explicitly overrides the current entry. """ import mimetypes # Grab Python's built-in mimetypes dictionary. contentTypes = mimetypes.types_map # Update Python's semi-erroneous dictionary with a few of the # usual suspects. contentTypes.update( { '.conf': 'text/plain', '.diff': 'text/plain', '.exe': 'application/x-executable', '.flac': 'audio/x-flac', '.java': 'text/plain', '.ogg': 'application/ogg', '.oz': 'text/x-oz', '.swf': 'application/x-shockwave-flash', '.tgz': 'application/x-gtar', '.wml': 'text/vnd.wap.wml', '.xul': 'application/vnd.mozilla.xul+xml', '.py': 'text/plain', '.patch': 'text/plain', } ) # Users can override these mime-types by loading them out configuration # files (this defaults to ['/etc/mime.types']). for location in mimetype_locations: if os.path.exists(location): contentTypes.update(mimetypes.read_mime_types(location)) return contentTypes def getTypeAndEncoding(filename, types, encodings, defaultType): p, ext = os.path.splitext(filename) ext = ext.lower() if encodings.has_key(ext): enc = encodings[ext] ext = os.path.splitext(p)[1].lower() else: enc = None type = types.get(ext, defaultType) return type, enc ## # Test code ## if __name__ == '__builtin__': # Running from twistd -y from twisted.application import service, strports from twisted.web2 import server res = File('/') application = service.Application("demo") s = strports.service('8080', server.Site(res)) s.setServiceParent(application) TwistedWeb2-8.1.0/twisted/plugins/0000755000175000017500000000000011017352661015503 5ustar dokodokoTwistedWeb2-8.1.0/twisted/plugins/twisted_web2.py0000644000175000017500000000161010776712352020465 0ustar dokodoko# Copyright (c) 2001-2008 Twisted Matrix Laboratories. # See LICENSE for details. from zope.interface import implements from twisted.plugin import IPlugin from twisted.web2.iweb import IResource class _Web2ResourcePlugin(object): implements(IPlugin, IResource) def __init__(self, name, className, description): self.name = name self.className = className self.description = description TestResource = _Web2ResourcePlugin("TestResource", "twisted.web2.plugin.TestResource", "I'm a test resource") from twisted.application.service import ServiceMaker TwistedWeb2 = ServiceMaker('Twisted Web2', 'twisted.web2.tap', ("An HTTP/1.1 web server that can serve from a " "filesystem or application resource."), "web2")