#11675 closed (fixed)
Support new memcached wrapper pylibmc
Reported by: otherjacob | Owned by:
Component: Core (Cache system) | Version: dev
Severity: | Keywords: cache pylibmc memcached.py
Cc: danger@…, tomasz.elendt@…, nikitka@… | Triage Stage: Accepted
Has patch: yes | Needs documentation: no
Needs tests: no | Patch needs improvement: yes
Easy pickings: no | UI/UX: no
Description
pylibmc is a Python wrapper around the libmemcache C library.
According to its site, it is faster than cmemcache.
The API is essentially the same as python-memcached's, so I added the fallback chain: pylibmc -> cmemcache -> memcache.
The patch also includes changes related to eternal (non-expiring) caching.
This is my first patch. Thank you.
Attachments (2)
Change History (23)
by , 15 years ago
Attachment: memcached.py.diff added
comment:1 by , 15 years ago
comment:2 by , 15 years ago
Triage Stage: Unreviewed → Accepted
I'm marking this accepted, but I don't think implementing *another* fallback is the right way to go. I think the Memcached backend should just have a class "connection_class" attribute and then people can subclass the backend to add support for new memcached wrappers.
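A minimal sketch of that suggestion, assuming a `connection_class` attribute pattern (all class names below are hypothetical, not Django's actual cache classes):

```python
# Illustrative only: made-up class names to show the connection_class idea.
import memcache
import pylibmc


class MemcachedCacheBackend(object):
    # Subclasses override this to plug in a different client library.
    connection_class = memcache.Client

    def __init__(self, servers):
        self._cache = self.connection_class(servers)

    def get(self, key, default=None):
        value = self._cache.get(key)
        return default if value is None else value


class PyLibMCCacheBackend(MemcachedCacheBackend):
    # Supporting a new wrapper becomes a one-line subclass.
    connection_class = pylibmc.Client
```

Usage would then be e.g. `cache = PyLibMCCacheBackend(["127.0.0.1:11211"])`, with the rest of the backend code untouched.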
comment:3 by , 15 years ago
This is actually a wrapper around libmemcached: http://tangent.org/552/libmemcached.html
It exposes a lot more functionality than the other ones. Just to name a few: set_multi(), delete_multi(), flush_all(), prepend(), append().
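For illustration, a few of those calls in pylibmc (the server address is assumed; the API largely mirrors python-memcached):

```python
import pylibmc

mc = pylibmc.Client(["127.0.0.1:11211"])  # assumed local memcached instance

mc.set_multi({"a": 1, "b": 2})      # store several keys in one round trip
mc.set("greeting", "hello")
mc.append("greeting", " world")     # server-side append, no read-modify-write
mc.prepend("greeting", ">> ")
print(mc.get("greeting"))           # ">> hello world"
mc.delete_multi(["a", "b"])
mc.flush_all()                      # wipe the entire cache
```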
comment:4 by , 15 years ago
Has patch: unset
Owner: changed
Status: new → assigned
I'm going to go ahead and adopt Alex's suggestion for future library support.
comment:5 by , 15 years ago
So rather than adding another fallback, we were thinking that one could specify the preferred memcached library as follows:
CACHE_BACKEND = memcached://server:port?lib=memcache
Here the lib value could be cmemcache, memcache, or some other Python module. If no library is specified, it defaults to the current behavior: cmemcache first, then the memcache fallback.
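A rough sketch of how the backend could resolve such a lib argument (illustrative only, not the code that eventually shipped; the query-string parsing is omitted):

```python
# Hypothetical helper: imports the client module named by "?lib=..." and falls
# back to the current cmemcache -> memcache behavior when it is absent.
import importlib


def get_memcached_module(lib=None):
    if lib:
        return importlib.import_module(lib)
    for name in ("cmemcache", "memcache"):
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError("No supported memcached client library found")
```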
comment:6 by , 15 years ago
Triage Stage: Accepted → Design decision needed
Erm... there's a test, but no actual fix?
Also - I'm not completely sold on either approach. Specifying a module name in the CACHE_BACKEND setting is a neat idea - provided there is some sort of guarantee that the API of a Python memcache library is always the same. On first inspection, this isn't true (set_many vs set_multi). Subclassing seems like overkill, given that there is a limited number of support libraries out there.
Another option that hasn't been floated is to allow pylibmc:// definitions that point to a cache.memcache.PyLibMemcache implementation; that is, we do the subclassing, so end users don't have to.
Also - given that cmemcache is clearly deprecated, we should deprecate its use as well.
comment:8 by , 15 years ago
milestone: → 1.2
Triage Stage: Design decision needed → Accepted
Putting back to 1.2; there has been some discussion about including a partial fix in 1.2 that would start the deprecation process for cmemcache. That won't close this ticket, but it should be addressed before 1.2 final.
To that end, the mailing list discussion also resolved the design decision: we're going to add a new backend for pylibmc.
comment:9 by , 15 years ago
milestone: 1.2 → 1.3
Deferring this ticket to 1.3; #12427 is tracking the deprecation of cmemcache.
comment:10 by , 15 years ago
Those who need pylibmc support right now can have a look at http://gist.github.com/334682.
comment:11 by , 15 years ago
Will the pylibmc (well, libmemcached) behaviors be exposed? These are very powerful and a game changer in many cases. To mention some of their benefits:
- consistent hashing
- switch hash algo to remain compatible with other applications/libraries
- tweak TCP options such as nodelay for a much-welcome speed increase (see the example below)
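Those knobs map onto pylibmc's behaviors dictionary; a small example (server addresses are assumed, behavior names as documented by pylibmc/libmemcached):

```python
import pylibmc

mc = pylibmc.Client(
    ["10.0.0.1:11211", "10.0.0.2:11211"],  # assumed server pool
    binary=True,
    behaviors={
        "ketama": True,       # consistent hashing across the pool
        "hash": "md5",        # match the hash algorithm other clients use
        "tcp_nodelay": True,  # disable Nagle's algorithm for lower latency
    },
)
mc.set("key", "value")
```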
comment:12 by , 15 years ago (follow-up: comment:13)
Hey guys, just a quick comment from my experience switching to pylibmc on a (non-Django) project I'm working on. The transition was pretty smooth; the only implementation difference I noticed is that pylibmc raises pylibmc.NotFound if you incr/decr a key that doesn't exist, while the other Python clients raise ValueError. I'm pretty sure this will break the Django memcache backend for incr/decr operations.
comment:13 by , 15 years ago
Replying to mmalone:
pylibmc raises pylibmc.NotFound if you incr/decr a key that doesn't exist while the other Python clients raise ValueError. I'm pretty sure this will break the Django memcache backend for incr/decr operations.
If necessary, it should be easy enough to catch that and then re-raise a ValueError to be consistent. Django certainly does that to maintain consistency in other places (such as for errors raised by different db backends).
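A sketch of that normalisation (the wrapper function is hypothetical; only pylibmc.NotFound is pylibmc's real exception):

```python
import pylibmc


def incr(client, key, delta=1):
    # Translate pylibmc's miss exception into the ValueError the other
    # Python memcached clients raise, so callers see consistent behavior.
    try:
        return client.incr(key, delta)
    except pylibmc.NotFound:
        raise ValueError("Key '%s' not found" % key)
```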
comment:14 by , 15 years ago
Cc: added
milestone: 1.3 → 2.0
comment:15 by , 15 years ago
milestone: 2.0 → 1.3
comment:16 by , 14 years ago
Cc: added
comment:17 by , 14 years ago
pylibmc 1.1.1 does not work with mod_wsgi (fixed in https://github.com/lericson/pylibmc/commit/ddd2f011f73d8ccc6347c5471eff378bef58dbd5).
comment:18 by , 14 years ago
Cc: added
comment:19 by , 14 years ago
Has patch: set
Patch needs improvement: set
Summary: [patch] Support new memcached wrapper pylibmc → Support new memcached wrapper pylibmc
A combined patch with the change and tests is preferred.
comment:20 by , 14 years ago
Resolution: → fixed
Status: assigned → closed
(In [15005]) Fixed #11675 -- Added support for the PyLibMC cache library. In order to support this, and clean up some other 1.3 caching additions, this patch also includes some changes to the way caches are defined. This means you can now have multiple caches, in the same way you have multiple databases. A huge thanks to Jacob Burch for the work on the PyLibMC backend, and to Jannis for his work on the cache definition changes.
Sorry, attachment is mine.
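For reference, after the commit in comment 20, selecting the pylibmc backend uses Django 1.3's dictionary-style cache settings along these lines (the server address is an example):

```python
# settings.py -- the multiple-cache configuration introduced alongside
# the PyLibMC backend in Django 1.3.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.memcached.PyLibMCCache",
        "LOCATION": "127.0.0.1:11211",
    }
}
```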