authorNguyễn Gia Phong <mcsinyx@disroot.org>2021-03-09 22:24:01 +0700
committerNguyễn Gia Phong <mcsinyx@disroot.org>2021-03-09 22:24:01 +0700
commitff37459c4b83966a604a46769a8b384d07b28492 (patch)
treed76d623e8ed2c67d2f23df5e85e0f67be942faa5 /blog
parent1ff1746272a97d9c58d2e6a8936592f90fd5cd47 (diff)
downloadsite-ff37459c4b83966a604a46769a8b384d07b28492.tar.gz
Migrate the rest of GSoC'20 blogs
Diffstat (limited to 'blog')
-rw-r--r--  blog/gsoc2020/blog20200609.md  | 112
-rw-r--r--  blog/gsoc2020/blog20200622.md  | 113
-rw-r--r--  blog/gsoc2020/blog20200706.md  |  78
-rw-r--r--  blog/gsoc2020/blog20200720.md  |  84
-rw-r--r--  blog/gsoc2020/blog20200803.md  |  46
-rw-r--r--  blog/gsoc2020/blog20200817.md  |  52
-rw-r--r--  blog/gsoc2020/blog20200831.md  | 109
7 files changed, 594 insertions, 0 deletions
diff --git a/blog/gsoc2020/blog20200609.md b/blog/gsoc2020/blog20200609.md
new file mode 100644
index 0000000..b0e6a7b
--- /dev/null
+++ b/blog/gsoc2020/blog20200609.md
@@ -0,0 +1,112 @@
++++
+rss = "GSoC 2020: Unexpected Things When You're Expecting"
+date = Date(2020, 6, 9)
++++
+@def tags = ["pip", "gsoc"]
+
+# Unexpected Things When You're Expecting
+
+Hi everyone, I hope that you are all doing well and I wish you all good health!
+The last week has not been really kind to me, with a decent amount of
+academic pressure (my school year lasts until early July).
+It would be bold to say that I have spent 10 hours working on my GSoC project
+since the last check-in, let alone the 30 hours per week requirement.
+That being said, there were still some discoveries that I wish to share.
+
+\toc
+
+## The `multiprocessing[.dummy]` wrapper
+
+Most of my time was spent finalizing the multi{processing,threading}
+wrapper for the `map` functions that submit tasks to the worker pool.
+To my surprise, it is rather difficult to write something that is
+not only portable but also easy to read and test.
+
+As of {{pip 8320 "the latest commit"}}, I have realized the following:
+
+1. The `multiprocessing` module was not designed for its implementation
+   details to be abstracted away entirely.  For example, the lazy `map`s
+   can be really slow without a suitable chunk size (used to cut
+   the input iterable and distribute the chunks to workers in the pool).
+   By *suitable*, I mean only an order of magnitude smaller than
+   the length of the input.  This defeats half of the purpose of making
+   it lazy: allowing the input to be evaluated lazily.  Luckily, in
+   the use case I'm aiming for, the iterable argument is short and
+   the laziness is only needed for the output
+   (to pipeline download and installation).
+2. Mocking `import` for testing purposes can never be pretty.  One reason
+   is that we (Python users) have very little control over the calls to
+   the `import` statement and its lower-level implementation `__import__`.
+   In order to properly patch this built-in function, unlike for others
+   of the same group, we have to `monkeypatch` the name from `builtins`
+   (or `__builtins__` under Python 2) instead of the module that imports
+   stuff.  Furthermore, because of the special namespacing, to avoid
+   infinite recursion we need to alias the function to a different name
+   for the fallback.
+3. To add to the problem, `multiprocessing` lazily imports the fragile module
+   during pool creation.  Since the failure is platform-specific
+   (the lack of `sem_open`), it was decided to perform the check upon import
+   of `pip`'s wrapper module.  Although this behavior is easier to reason
+   about in human language, testing it requires invalidating the cached
+   import and re-importing the wrapper module (see the sketch after
+   this list).
+4. Last but not least, I now understand the pain of keeping Python 2
+   compatibility that many package maintainers still need to deal with
+   every day (although Python 2 has reached its end-of-life, `pip`, for
+   example, {{pip 6148 "will still support it for another year"}}).
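+
+To illustrate points 1 and 3, here is a minimal sketch of such a wrapper
+module and its test.  These are hypothetical names and my rough
+reconstruction, not `pip`'s actual code:
+
+```python
+# parallel.py: a sketch of the fallback wrapper (hypothetical module).
+try:
+    # Trigger the platform-specific ImportError (e.g. the lack of
+    # sem_open) here, at import time, instead of at pool creation.
+    import multiprocessing.synchronize  # noqa: F401
+    from multiprocessing import Pool
+except ImportError:
+    from multiprocessing.dummy import Pool  # fall back to multithreading
+
+
+def map_lazy(func, iterable, chunksize):
+    """Lazily map func over iterable using a pool of workers.
+
+    chunksize should be at most an order of magnitude smaller than
+    the length of iterable, or distributing the chunks dominates
+    the run time.
+    """
+    pool = Pool()
+    try:
+        yield from pool.imap(func, iterable, chunksize=chunksize)
+    finally:
+        pool.close()
+        pool.join()
+
+
+# In tests, __import__ has to be patched on builtins and aliased first
+# to avoid infinite recursion, and the cached wrapper invalidated:
+#
+#     real_import = builtins.__import__
+#
+#     def fake_import(name, *args, **kwargs):
+#         if name == 'multiprocessing.synchronize':
+#             raise ImportError(name)  # simulate the lack of sem_open
+#         return real_import(name, *args, **kwargs)
+#
+#     monkeypatch.setattr(builtins, '__import__', fake_import)
+#     importlib.reload(parallel)  # re-import to exercise the fallback
+```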
+
+## The change in direction
+
+Since last week, my mentor Pradyun Gedam and I have set up weekly real-time
+meetings (a fancy term for video/audio chats in the worldwide quarantine
+era) for the entire GSoC period.  During the last session, we decided to
+put parallelization of download during resolution on hold, in favor of a
+more beneficial goal: {{pip 7819 "partially download the wheels during
+dependency resolution"}}.
+
+![](/assets/swirl.png)
+
+As discussed by Danny McClanahan and the maintainers of `pip`, it is feasible
+to only download a few kB of a wheel to obtain enough metadata for
+dependency resolution.  While this is only applicable to wheels
+(i.e. prebuilt packages), other packaging formats only make up less than 20%
+of the downloads (at least on PyPI), and the figure is much lower for
+the most popular packages.  Therefore, this optimization alone could make
+[the upcoming backtracking resolver][]'s performance on par with the legacy
+one.
+
+Over the last few years, a lot of effort has been poured into
+replacing `pip`'s current resolver, which is unable to resolve conflicts.
+While its correctness will be ensured by some of the most talented and
+hard-working developers in the Python packaging community, from the users'
+point of view, it would be better if its performance did not lag
+behind the old one's.  Aside from the increase in CPU cycles for more
+rigorous resolution, more I/O, especially networking operations, is expected
+to be performed.  This is due to {{pip 7406#issuecomment-583891169 "the lack
+of a standard and efficient way to acquire the metadata"}}.  Therefore, unlike
+most package managers we are familiar with, `pip` has to fetch
+(and possibly build) the packages solely for dependency information.
+
+Fortunately, {{pep 427 recommended-archiver-features}} recommends that
+package builders place the metadata at the end of the archive.
+This allows the resolver to fetch only the last few kB using
+[HTTP range requests][] for the relevant information.
+Simply appending `Range: bytes=-8000` to the request headers
+in `pip._internal.network.download` makes the resolution process
+*lightning* fast.  Of course this breaks the installation, but I am confident
+that it is not difficult to implement this optimization cleanly.
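+
+To get a feel for the idea, here is a rough sketch using `requests` directly
+instead of `pip`'s internal networking, with a made-up URL, assuming both
+that the index supports range requests and that the metadata lies entirely
+within the fetched tail:
+
+```python
+import io
+import zipfile
+
+import requests
+
+url = 'https://example.org/packages/spam-1.0-py3-none-any.whl'
+# Fetch only the last 8000 bytes, which contain the ZIP
+# end-of-central-directory record and, usually, .dist-info.
+tail = requests.get(url, headers={'Range': 'bytes=-8000'}).content
+# zipfile compensates for the missing prefix based on the location
+# of the end-of-central-directory record, so listing works...
+with zipfile.ZipFile(io.BytesIO(tail)) as wheel:
+    for name in wheel.namelist():
+        # ... and so does reading members that lie within the tail.
+        if name.endswith('.dist-info/METADATA'):
+            print(wheel.read(name).decode())
+```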
+
+One drawback of this optimization is compatibility.  Not every Python
+package index supports range requests, and it is not possible to verify
+a partial wheel against its hash.  While the first case is unavoidable,
+as for the other, hash checking is usually used for pinned/locked-version
+requirements, where no backtracking is done during dependency resolution.
+
+Either way, before installation, the packages selected by the resolver
+can be downloaded in parallel.  This guarantees a larger pool of packages,
+compared to parallelization during resolution, where the number of downloads
+can be as low as one during trials of different versions of the same package.
+
+Unfortunately, I have not been able to do much other than
+{{pip 8411 "a minor clean up"}}.  I am looking forward to accomplishing more
+this week and seeing where this path will lead us!  At the moment,
+I am happy that I'm able to meet the blog deadline, at least in UTC!
+
+[the upcoming backtracking resolver]: http://www.ei8fdb.org/thoughts/2020/05/test-pips-alpha-resolver-and-help-us-document-dependency-conflicts
+[HTTP range requests]: https://developer.mozilla.org/en-US/docs/Web/HTTP/Range_requests
diff --git a/blog/gsoc2020/blog20200622.md b/blog/gsoc2020/blog20200622.md
new file mode 100644
index 0000000..3bb3a2c
--- /dev/null
+++ b/blog/gsoc2020/blog20200622.md
@@ -0,0 +1,113 @@
++++
+rss = "GSoC 2020: The Wonderful Wizard of O'zip"
+date = Date(2020, 6, 22)
++++
+@def tags = ["pip", "gsoc"]
+
+# The Wonderful Wizard of O'zip
+
+> Never give up... No one knows what's going to happen next.
+
+\toc
+
+## Preface
+
+Greetings and best wishes!  I had a lot of fun during the last week,
+although admittedly nothing was really finished.  In summary,
+this is the work I carried out over the last seven days:
+
+* Finalizing {{pip 8320 "utilities for parallelization"}}
+* {{pip 8467 "Continuing experimenting"}}
+  on {{pip 8442 "using lazy wheels for dependency resolution"}}
+* Polishing up {{pip 8411 "the patch"}} refactoring
+  `operations.prepare.prepare_linked_requirement`
+* Adding `flake8-logging-format`
+  {{pip 8423#issuecomment-645418725 "to the linter"}}
+* Splitting {{pip 8456 "the linting patch"}} from {{pip 8332 "the PR adding
+  the license requirement to vendor README"}}
+
+## The `multiprocessing[.dummy]` wrapper
+
+Yes, you read that right: this is the same section as in last fortnight's blog.
+My mentor Pradyun Gedam gave me a green light to have {{pip 8411}} merged
+without support for Python 2 and the non-lazy map variant, which turned out
+to be troublesome for multithreading.
+
+The tests still need to pass, of course, and the flaky tests (see failing tests
+over Azure Pipelines in the past) really gave me a panic attack earlier today.
+We probably need to mark them as xfail or investigate why they are
+nondeterministic specifically on Azure, but the real reason I was *all caught
+up and confused* was that the unit tests I added mess with the cached imports,
+and as `pip`'s tests are run in parallel, who knows what that might affect.
+I was so relieved not to discover any new tests made flaky by the ones
+I'm trying to add!
+
+## The file-like object mapping ZIP over HTTP
+
+This is where the fun starts.  Before we dive in, let's recall some
+background information.  As discovered by Danny McClanahan
+in {{pip 7819}}, it is possible to download only a portion of a wheel
+and still have `pip` read the distribution's metadata from it.
+In the same thread, Daniel Holth suggested that one may use
+HTTP range requests to specifically ask for the tail of the wheel,
+where the ZIP's central directory record, and usually also `dist-info`
+(the directory containing `METADATA`), can be found.
+
+Well, *usually*.  While {{pep 427}} does indeed recommend
+
+> Archivers are encouraged to place the `.dist-info` files physically
+> at the end of the archive.  This enables some potentially interesting
+> ZIP tricks including the ability to amend the metadata without
+> rewriting the entire archive.
+
+one of the mentioned *tricks* is adding shared libraries to wheels
+of extension modules (using e.g. `auditwheel` or `delocate`).
+Thus for non-pure Python wheels, it is unlikely that the metadata
+lies in the last few megabytes.  Ignoring source distributions is bad enough;
+we can't afford an optimization that doesn't work for extension modules,
+which are still an integral part of the Python ecosystem )-:
+
+But hey, the ZIP's central directory record is guaranteed to be at the end
+of the file!  Couldn't we do something with that?  The short answer is yes.
+The long answer is, well, yessssssss!  With that, plus magic provided
+by most operating systems, this is what we figured out:
+
+1. We can download a relatively small chunk at the end of the wheel
+   until it is recognizable as a valid ZIP file.
+2. In order for the end of the archive to actually appear as the end to
+   `zipfile`, we feed it an object with `seek` and `read` defined.
+   As navigating to the rear of the file is performed by calling `seek`
+   with a relative offset and `whence=SEEK_END` (see `man 3 fseek`
+   for more details), we are completely able to make wheels in the cloud
+   behave as if they were available locally.
+
+   ![Wheel in the cloud](/assets/cloud.gif)
+
+3. For large wheels, it is better to store them on disk instead of in memory.
+   For smaller ones, it is also preferable to store them as files to avoid
+   (error-prone and often not really efficient) manual tracking and joining
+   of downloaded segments.  We only use a small portion of the wheel, but
+   just in case anyone is wondering, we have very little control over
+   when `tempfile.SpooledTemporaryFile` rolls over, so the memory-disk
+   hybrid does not work exactly as expected.
+4. With all these in mind, all we have to do is define an intermediate object
+   that checks for local availability and downloads if needed on calls
+   to `read`, to lazily provide the data over HTTP and reduce execution
+   time (see the sketch after this list).
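+
+Below is a minimal sketch of such an object, with hypothetical names
+and no caching of ranges already fetched; it is only meant to show
+the `seek`/`read` trick, not the actual implementation:
+
+```python
+import io
+from tempfile import NamedTemporaryFile
+
+import requests
+
+
+class LazyFileOverHTTP(io.RawIOBase):
+    """File-like object fetching ranges of a remote file on demand."""
+
+    def __init__(self, url):
+        self._url = url
+        self._length = int(requests.head(url).headers['Content-Length'])
+        self._file = NamedTemporaryFile()
+        self._file.truncate(self._length)  # a hole of the right size
+
+    def readable(self):
+        return True
+
+    def seekable(self):
+        return True
+
+    def seek(self, offset, whence=io.SEEK_SET):
+        # seek(-8000, SEEK_END) works out of the box, so to zipfile
+        # the remote archive looks like a complete local file.
+        return self._file.seek(offset, whence)
+
+    def tell(self):
+        return self._file.tell()
+
+    def read(self, size=-1):
+        start = self.tell()
+        if size < 0:
+            size = self._length - start
+        stop = min(start + size, self._length)
+        if stop <= start:
+            return b''
+        response = requests.get(
+            self._url, headers={'Range': f'bytes={start}-{stop-1}'})
+        self._file.seek(start)
+        self._file.write(response.content)
+        self._file.seek(start)
+        return self._file.read(size)
+```
+
+With this, `zipfile.ZipFile(LazyFileOverHTTP(url))` can list and read
+`dist-info` from a remote wheel while downloading only the parts it touches.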
+
+The only theoretical challenge left was keeping track of downloaded intervals,
+which I finally figured out after a few trials and errors.  The code
+was submitted as a pull request to `pip` at {{pip 8467}}.  A more modern
+(read: Python 3-only) variant was packaged and uploaded to PyPI under
+the name of [lazip][].  I am unaware of any use case for it outside of `pip`,
+but it's certainly fun to play with d-:
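+
+The interval bookkeeping itself boils down to merging overlapping ranges,
+something like this toy helper (not lazip's actual code):
+
+```python
+def merge(intervals):
+    """Merge overlapping or touching (start, stop) intervals.
+
+    >>> merge([(0, 10), (25, 30), (5, 15)])
+    [(0, 15), (25, 30)]
+    """
+    merged = []
+    for start, stop in sorted(intervals):
+        if merged and start <= merged[-1][1]:  # overlaps the last one
+            merged[-1] = (merged[-1][0], max(merged[-1][1], stop))
+        else:
+            merged.append((start, stop))
+    return merged
+```
+
+Each `read` then only needs to fetch the gaps between the requested range
+and the intervals already downloaded.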
+
+## What's next?
+
+I have been falling short of getting the PRs mentioned above merged for
+quite a while.  With `pip`'s next beta coming really soon, I have to somehow
+make the patches reach a certain standard and gather enough attention
+to be part of the pre-release—beta-testing would greatly help the success
+of the GSoC project.  To the other GSoC students and mentors reading this,
+I hope your projects turn out successful too!
+
+[lazip]: https://pypi.org/project/lazip/
diff --git a/blog/gsoc2020/blog20200706.md b/blog/gsoc2020/blog20200706.md
new file mode 100644
index 0000000..9c41b31
--- /dev/null
+++ b/blog/gsoc2020/blog20200706.md
@@ -0,0 +1,78 @@
++++
+rss = "GSoC 2020: I'm Not Drowning On My Own"
+date = Date(2020, 7, 6)
++++
+@def tags = ["pip", "gsoc"]
+
+# I'm Not Drowning On My Own
+
+\toc
+
+## Cold Water
+
+Hello there!  My school year is coming to an end, with some final assignments
+and group projects left to be done.  I for sure underestimated their workload,
+and in the last (and probably next) few days I'm drowning in work
+trying to meet my deadlines.
+
+One project that might be remotely relevant is [cheese-shop][], which tries to
+manage the metadata of packages from the real [Cheese Shop][].  Other than that,
+schoolwork is draining a lot of my time and I can't remember the last time
+I came up with something new for my GSoC project )-;
+
+## Warm Water
+
+On the bright side, I received a lot of help and encouragement
+from contributors and stakeholders of `pip`.  In the last week alone,
+I had five pull requests merged:
+
+* {{pip 8332}}: Add license requirement to `_vendor/README.rst`
+* {{pip 8320}}: Add utilities for parallelization
+* {{pip 8504}}: Parallelize `pip list --outdated` and `--uptodate`
+* {{pip 8411}}: Refactor `operations.prepare.prepare_linked_requirement`
+* {{pip 8467}}: Add utility to lazily acquire wheel metadata over HTTP
+
+In addition to helping me get my PRs merged, my mentor Pradyun Gedam
+also gave me my first official feedback, including what I'm doing right
+(and wrong too!) and what I should keep doing to increase the chance of
+the project being successful.
+
+{{pip 7819}}'s roadmap (Danny McClanahan's discoveries and work on lazy
+wheels) is being closely tracked by `hatch`'s maintainer Ofek Lev, which
+really makes me proud and warms my heart: what I'm helping build is actually
+needed by the community!
+
+## Learning How To Swim
+
+With {{pip 8467}} and {{pip 8530}} merged, I'm now working on {{pip 8532}}
+which aims to roll out the lazy wheel as the way to obtain
+dependency information via the CLI flag `--use-feature=lazy-wheel`.
+
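+Once it lands, opting in should look something like this
+(assuming the flag keeps the name currently proposed):
+
+```console
+$ pip install --use-feature=lazy-wheel axuy
+```
+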
+{{pip 8532}} was failing initially, despite being relatively trivial and
+based on a commit that was passing.  Surprisingly, after rebasing it
+on top of {{pip 8530}}, it mysteriously became green.  After the first
+(early) review, I was able to iterate on my earlier code, which used
+the ambiguous exception `RuntimeError`.
+
+What's left to be done is *just* adding some functional tests (I'm pretty sure
+this will be either overwhelming or underwhelming) to make sure that
+the command-line flag works correctly.  Hopefully this can make it into
+the beta of the upcoming release {{pip 8511 "this month"}}.
+
+![Lazy wheel](/assets/lazy-wheel.jpg)
+
+In other news, I've also submitted {{pip 8538 "a patch improving the tests
+for the parallelization utilities"}}, which were really messy as I first
+wrote them.  Better late than never!
+
+Metaphors aside, I actually can't swim d-:
+
+## Diving Plan
+
+After {{pip 8532}}, I think I'll try to parallelize downloads of the wheels
+that are lazily fetched only for metadata.  With the current implementation
+of the new resolver, for `pip install`, this can be injected directly
+between the resolution and the build/installation process.
+
+[cheese-shop]: https://github.com/McSinyx/cheese-shop
+[Cheese Shop]: https://pypi.org
diff --git a/blog/gsoc2020/blog20200720.md b/blog/gsoc2020/blog20200720.md
new file mode 100644
index 0000000..43738a7
--- /dev/null
+++ b/blog/gsoc2020/blog20200720.md
@@ -0,0 +1,84 @@
++++
+rss = "GSoC 2020: I've Walked 500 Miles..."
+date = Date(2020, 7, 20)
++++
+@def tags = ["pip", "gsoc"]
+
+# I've Walked 500 Miles...
+
+> ... and I would walk 500 more\
+> Just to be the man who walks a thousand miles\
+> To fall down at your door
+>
+> ![500 miles](/assets/500-miles.gif)
+
+\toc
+
+## The Main Road
+
+Hi, have you met `fast-deps`?  It's (going to be) the name of `pip`'s
+experimental feature that may improve the speed of dependency resolution
+of the new resolver.  By avoiding downloading whole wheels just to
+obtain metadata, it is especially helpful when `pip` has to do
+heavy backtracking to resolve conflicts.
+
+Thanks to {{pip 8532#discussion_r453990728 "Chris Hunt's review on GH-8537"}},
+my mentor Pradyun Gedam and I worked out a less hacky approach to interjecting
+the call to the lazy wheel during the resolution process.  A new PR
+{{pip 8588}} was filed to implement it—I could have *just* worked on top of
+the old PR and rebased, but my `git` skill is far from gud enuff
+to confidently do it.
+
+Testing this one has been a lot of fun though.  At first, integration tests
+were added as a rerun of the tests for the new resolver, with an additional
+flag to enable `fast-deps`.  It indeed made me feel guilty towards [Travis][],
+who had to work around 30 minutes more every run.  Per Chris Hunt's
+suggestion, in the new PR I instead wrote a few functional tests for
+the areas relating the most to the feature, namely `pip`'s subcommands
+`wheel`, `download` and `install`.
+
+It was also suggested that a mock server with HTTP range request support
+might be better (in terms of performance and reliability) for testing.
+However, {{pip 8584#issuecomment-659227702 "I have yet to be able to make
+Werkzeug do it"}}.
+
+Why did I say I'm halfway there?  With the parallel utilities merged and a way
+to quickly get the list of distributions to be downloaded being really close,
+what's left is *only* to figure out a way to properly download them
+in parallel.  With no distribution to be added during the download process,
+the model fits very well with the architecture in [my original proposal][].
+A batch downloader can be implemented to track the progress of each download
+and thus report them cleanly as e.g. a progress bar or percentage.  This is
+the part I am second-most excited about in my GSoC project this summer
+(after the synchronization of downloads written in my proposal, which was then
+superseded by `fast-deps`) and I can't wait to do it!
+
+## The Side Quests
+
+As usual, I make sure that I complete every side quest I see during the journey:
+
+* {{pip 8568}}: Declare constants in `configuration.py` as such
+* {{pip 8571}}: Clean up `Configuration.unset_value`
+  and nit the class' `__init__`
+* {{pip 8578}}: Allow verbose/quiet level
+  to be specified via config file and env var
+* {{pip 8599}}: Replace tabs by spaces for consistency
+
+## Snap Back to Reality
+
+A bit about me, I actually walked 500 meters earlier today to a bank
+and walked 500 more to another to prepare my Visa card for purchasing
+the upcoming Pinephone prototype.  It's one of the first smartphones
+to fully support a GNU/Linux distribution, where one can run desktop apps
+(including proper terminals) as well as traditional services like SSH,
+HTTP server and IPFS node because why not?  Just a few hours ago,
+I pre-ordered the [postmarketOS community edition][] with additional hardware
+for convergence.
+
+If you did not come here for a Pinephone ad, please accept my apologies
+though d-; and to those reading this, I hope you all can become the person
+who walks a thousand miles to fall down at the door opening to all
+you ever wished for!
+
+[Travis]: https://travis-ci.com
+[my original proposal]: /assets/pip-parallel-dl.pdf
+[postmarketOS community edition]: https://postmarketos.org/blog/2020/07/15/pinephone-ce-preorder/
diff --git a/blog/gsoc2020/blog20200803.md b/blog/gsoc2020/blog20200803.md
new file mode 100644
index 0000000..de2ef8d
--- /dev/null
+++ b/blog/gsoc2020/blog20200803.md
@@ -0,0 +1,46 @@
++++
+rss = "GSoC 2020: Sorting Things Out"
+date = Date(2020, 8, 3)
++++
+@def tags = ["pip", "gsoc"]
+
+# Sorting Things Out
+
+Hi!  I really hope that everyone reading this is still doing okay,
+and if that isn't the case, I wish you a good day!
+
+## `pip` 20.2 Released!
+
+Last Wednesday, `pip` 20.2 was released, delivering the `2020-resolver`
+as well as many other improvements!  I was lucky to be able
+to get the `fast-deps` feature included as part of the release.
+A brief description of this *experimental* feature as well as testing
+instructions can be found on [Python Discuss][].
+
+The public exposure of the feature also reminded me of some further
+{{pip 8681 optimization}} to make on {{pip 8670 "the lazy wheel"}}.
+Hopefully, even without download parallelization, it will not be so slow
+as to put off testing by interested users of `pip`.
+
+## Preparation for Download Parallelization
+
+As of this moment, we already have:
+
+* {{pip 8162#issuecomment-667504162 "Multithreading pool fallback working"}}
+* An opt-in to use the lazy wheel to obtain dependency information,
+  and thus getting a list of wheels at the end of resolution
+  ready to be downloaded together
+
+What's left is *only* to interject a parallel download somewhere after
+the dependency resolution step.  Still, this has me struggling way more
+than I ever imagined.  I got so stuck that I had to give myself a day off
+in the middle of the week (and study some Rust), then I came up with
+{{pip 8638 "something that was agreed upon as difficult to maintain"}}.
+
+Indeed, a large part of this is my fault, for not communicating the design
+thoroughly with `pip`'s maintainers and not carefully noting stuff down
+during (verbal) discussions with my mentor.  Thankfully {{pip 8685
+"Chris Hunt came to the rescue"}} and did a refactoring that will
+make my future work much easier and cleaner.
+
+[Python Discuss]: https://discuss.python.org/t/announcement-pip-20-2-release/4863/2
diff --git a/blog/gsoc2020/blog20200817.md b/blog/gsoc2020/blog20200817.md
new file mode 100644
index 0000000..40caad5
--- /dev/null
+++ b/blog/gsoc2020/blog20200817.md
@@ -0,0 +1,52 @@
++++
+rss = "GSoC 2020: Parallelizing Wheel Downloads"
+date = Date(2020, 8, 17)
++++
+@def tags = ["pip", "gsoc"]
+
+# Parallelizing Wheel Downloads
+
+> And now it's clear as this promise\
+> That we're making\
+> Two progress bars into one
+
+\toc
+
+Hello there!  It has been raining a lot lately, and a mosquito has given me
+dengue fever today.  To whoever is reading this, I hope it never happens
+to you.
+
+## Download Parallelization
+
+I've been working on `pip`'s download parallelization for quite a while now.
+As distribution download in `pip` was modeled as a lazily evaluated iterable
+of chunks, parallelizing such a procedure is as simple as submitting routines
+that write files to disk to a worker pool.
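+
+In pseudo-`pip` terms, the naive idea looks something like this sketch
+(all names here are hypothetical, not actual `pip` internals):
+
+```python
+from multiprocessing.dummy import Pool  # thread-based pool
+
+
+def download_one(download):
+    """Consume one lazy iterable of chunks, writing it to disk."""
+    filename, chunks = download
+    with open(filename, 'wb') as file:
+        for chunk in chunks:
+            file.write(chunk)
+    return filename
+
+
+def download_parallel(downloads, workers=5):
+    """Download (filename, chunks) pairs in parallel."""
+    with Pool(workers) as pool:
+        yield from pool.imap_unordered(download_one, downloads)
+```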
+
+Or at least that is what I thought.
+
+## Progress Reporting UI
+
+`pip` is currently using custom progress reporting classes,
+which were not designed to work with multithreaded code.  At first, I wanted
+to try using these instead of defining a separate UI for multithreaded
+progress.  As they use system signals for termination, the progress bars
+have to be run in the main thread.  Or sort of.
+
+Since the progress bars are designed as iterators, I realized that we
+can call `next` on them.  So, quickly, I threw in some queues and locks
+and prototyped the first *working* {{pip 8771 "implementation of
+progress synchronization"}}.
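+
+The gist of that prototype, heavily simplified (a toy reconstruction,
+not the code in the PR), is to have the workers report chunks on a queue
+while the main thread, which owns the bar, advances it with `next`:
+
+```python
+from multiprocessing.dummy import Pool
+from queue import Queue
+
+
+def download_all(downloads, progress_bar):
+    """Drive a single progress bar (an iterator) from many threads."""
+    events = Queue()
+    DONE = object()  # sentinel marking one finished download
+
+    def download_one(chunks):
+        for chunk in chunks:
+            # ... write the chunk to disk, then report it:
+            events.put(len(chunk))
+        events.put(DONE)
+
+    with Pool() as pool:
+        pool.map_async(download_one, downloads)
+        remaining = len(downloads)
+        while remaining:  # only the main thread touches the bar
+            if events.get() is DONE:
+                remaining -= 1
+            else:
+                next(progress_bar)
+```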
+
+## Performance Issues
+
+Welp, I only said that it works; I didn't mention the performance,
+which is terrible.  I am pretty sure that the slowdown comes from
+the synchronization, since the `map_multithread` call doesn't seem
+to trigger anything that may introduce any sort of blocking.
+
+This seems like a lot of fun, and I hope I'll get better tomorrow
+to continue playing with it!
diff --git a/blog/gsoc2020/blog20200831.md b/blog/gsoc2020/blog20200831.md
new file mode 100644
index 0000000..eea0ead
--- /dev/null
+++ b/blog/gsoc2020/blog20200831.md
@@ -0,0 +1,109 @@
++++
+rss = "GSoC 2020: Outro"
+date = Date(2020, 8, 31)
++++
+@def tags = ["pip", "gsoc"]
+
+# Outro
+
+> Steamed fish was amazing, matter of fact\
+> Let me get some jerk chicken to go\
+> Grabbed me one of them lemon pie theories\
+> And let me get some of them benchmarks you theories too
+
+\toc
+
+## The Look
+
+At the time of writing,
+{{pip 8771 "implementation-wise parallel download is ready"}}:
+
+[![asciicast](/assets/pip-8771.svg)](https://asciinema.org/a/356704)
+
+Does this mean I've finished everything just in time?  This sounds too good
+to be true!  And how does it perform?  Welp...
+
+## The Benchmark
+
+Here comes the bad news: under a decent connection to the package index,
+using `fast-deps` does not make `pip` faster.  For a fair comparison,
+I will time `pip download` in the following cases:
+
+### Average Distribution
+
+For convenience, let's refer to the commands to be used as follows:
+
+```console
+$ pip --no-cache-dir download {requirement}  # legacy-resolver
+$ pip --use-feature=2020-resolver \
+   --no-cache-dir download {requirement}  # 2020-resolver
+$ pip --use-feature=2020-resolver --use-feature=fast-deps \
+   --no-cache-dir download {requirement}  # fast-deps
+```
+
+In the first test, I used [axuy][] and obtained the following results:
+
+| legacy-resolver | 2020-resolver | fast-deps |
+| --------------- | ------------- | --------- |
+| 7.709s          | 7.888s        | 10.993s   |
+| 7.068s          | 7.127s        | 11.103s   |
+| 8.556s          | 6.972s        | 10.496s   |
+
+Funnily enough, running `pip download` with `fast-deps` in a directory
+already containing the downloaded files still took around 7-8 seconds.
+This is because, to lazily download a wheel, `pip` has to
+{{pip 8670 "make many requests"}}, which are apparently more expensive
+than the actual data transmission on my network.
+
+@@colbox-blue
+With an unstable connection to PyPI (for reasons I am not confident enough
+to state), this is what I got:
+
+| 2020-resolver | fast-deps |
+| ------------- | --------- |
+| 1m16.134s     | 0m54.894s |
+| 1m0.384s      | 0m40.753s |
+| 0m50.102s     | 0m41.988s |
+
+As the connection was *unstable*, and the majority of `pip` networking
+is performed in CI/CD environments with large and stable bandwidth, I am
+unsure what this result is supposed to tell us (-;
+@@
+
+### Large Distribution
+
+In this test, I used [TensorFlow][] as the requirement and obtained
+the following figures:
+
+| legacy-resolver | 2020-resolver | fast-deps |
+| --------------- | ------------- | --------- |
+| 0m52.135s       | 0m58.809s     | 1m5.649s  |
+| 0m50.641s       | 1m14.896s     | 1m28.168s |
+| 0m49.691s       | 1m5.633s      | 1m22.131s |
+
+### Distribution with Conflicting Dependencies
+
+Here is a requirement that triggers a decent amount of backtracking in
+the current implementation of the new resolver, `oslo-utils==1.4.0`:
+
+| 2020-resolver | fast-deps |
+| ------------- | --------- |
+| 14.497s       | 24.010s   |
+| 17.680s       | 28.884s   |
+| 16.541s       | 26.333s   |
+
+## What Now?
+
+I don't know, to be honest.  At this point, I feel that I've failed my own
+expectations (and those of other stakeholders of `pip`) and wasted the time
+and effort `pip`'s maintainers spent reviewing the dozens of PRs I've made
+in the last three months.
+
+On the bright side, this has been an opportunity for me to explore
+the codebase of a package manager and discover various edge cases
+that the new resolver has yet to cover (e.g. I've just noticed that
+`pip download` saves to-be-discarded distributions; I'll file an issue
+on that soon).  Plus, I got to know many new and cool people and ideas,
+which I hope makes me a more helpful individual to work on
+Python packaging in the future.
+
+[TensorFlow]: https://www.tensorflow.org
+[axuy]: https://www.youtube.com/playlist?list=PLAA9fHINq3sayfxEyZSF2D_rMgDZGyL3N