Ticket #6342: patch_removewww.patch
File patch_removewww.patch, 2.3 KB (added by , 17 years ago)
django/middleware/common.py
@@ -13,9 +13,9 @@

         - Forbids access to User-Agents in settings.DISALLOWED_USER_AGENTS

-        - URL rewriting: Based on the APPEND_SLASH and PREPEND_WWW settings,
-          this middleware appends missing slashes and/or prepends missing
-          "www."s.
+        - URL rewriting: Based on the APPEND_SLASH, PREPEND_WWW and REMOVE_WWW
+          settings this middleware appends missing slashes and/or removes/prepends
+          "www."s.

             - If APPEND_SLASH is set and the initial URL doesn't end with a
               slash, and it is not found in urlpatterns, a new URL is formed by
@@ -40,8 +40,8 @@
                 if user_agent_regex.search(request.META['HTTP_USER_AGENT']):
                     return http.HttpResponseForbidden('<h1>Forbidden</h1>')

-        # Check for a redirect based on settings.APPEND_SLASH
-        # and settings.PREPEND_WWW
+        # Check for a redirect based on settings.APPEND_SLASH,
+        # settings.PREPEND_WWW, and settings.REMOVE_WWW
         host = request.get_host()
         old_url = [host, request.path]
         new_url = old_url[:]
@@ -50,6 +50,13 @@
                 not old_url[0].startswith('www.')):
             new_url[0] = 'www.' + old_url[0]

+        # Remove the "www." subdomain from the url if REMOVE_WWW is set and
+        # the url starts with "www."
+        if (settings.REMOVE_WWW and old_url[0] and
+                old_url[0].startswith('www.')):
+            new_url[0] = new_url[0][4:]
+
+
         # Append a slash if APPEND_SLASH is set and the URL doesn't have a
         # trailing slash and there is no pattern for the current path
         if settings.APPEND_SLASH and (not old_url[1].endswith('/')):
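For illustration, a minimal standalone sketch of the host rewriting this hunk introduces. It is not part of the patch: the module-level PREPEND_WWW/REMOVE_WWW booleans and the rewrite_host() helper are stand-ins for django.conf.settings and for the middleware's old_url/new_url handling, shown only to make the new branch concrete.

    # Standalone sketch of the patched host rewriting (illustration only).
    # PREPEND_WWW / REMOVE_WWW are plain booleans here rather than values
    # read from django.conf.settings, and rewrite_host() is a hypothetical
    # helper mirroring the middleware's old_url/new_url logic.

    PREPEND_WWW = False
    REMOVE_WWW = True    # the setting added by this patch

    def rewrite_host(host):
        new_host = host
        # Existing behaviour: add "www." when PREPEND_WWW is set and it is missing.
        if PREPEND_WWW and host and not host.startswith('www.'):
            new_host = 'www.' + host
        # New behaviour: strip a leading "www." when REMOVE_WWW is set.
        if REMOVE_WWW and host and host.startswith('www.'):
            new_host = new_host[4:]
        return new_host

    print(rewrite_host('www.example.com'))   # -> example.com
    print(rewrite_host('example.com'))       # -> example.com (unchanged)

In the middleware itself, a redirect is only issued when the rewritten host/path differs from the original.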
django/conf/global_settings.py
@@ -179,6 +179,9 @@
 # Whether to prepend the "www." subdomain to URLs that don't have it.
 PREPEND_WWW = False

+# Whether to remove the "www." subdomain from URLs that have it.
+REMOVE_WWW = False
+
 # List of compiled regular expression objects representing User-Agent strings
 # that are not allowed to visit any page, systemwide. Use this for bad
 # robots/crawlers. Here are a few examples:
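Assuming the patch is applied, a project would opt in from its own settings module; the snippet below is a hypothetical settings.py fragment, not part of the patch. Note that the two settings are effectively mutually exclusive: with both PREPEND_WWW and REMOVE_WWW enabled, the middleware would redirect www and non-www hosts at each other in a loop.

    # Hypothetical project settings.py fragment (not part of the patch).
    # CommonMiddleware must be installed for either setting to take effect.
    MIDDLEWARE_CLASSES = (
        'django.middleware.common.CommonMiddleware',
        # ... other middleware ...
    )

    PREPEND_WWW = False   # existing setting: add a missing "www."
    REMOVE_WWW = True     # new setting from this patch: strip a leading "www."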