Cool results in represent

This post is just a quick update, after a very hectic week of code writing, with an example Python session showing the cool things we can now do in represent. I'm currently finalizing the way that represent works, so a much more detailed post will follow later, but right now I just want to show that we can actually do quantum mechanics in continuous bases! (In this case, with the particle-in-a-box system.)

Here is an example Python session, with notes on the output:

>>> from sympy.physics.quantum import *
>>> from sympy.physics.quantum.piab import *
>>> wf = represent(PIABKet())
>>> wf
Wavefunction(2**(1/2)*(1/L)**(1/2)*sin(pi*n*x_1/L), (x_1, 0, L))
>>> wf.norm
1

We get a particle-in-a-box Wavefunction when the ket is represented, and it's properly normalized.

>>> represent(XOp()*PIABKet())
Wavefunction(2**(1/2)*x_2*(1/L)**(1/2)*sin(pi*n*x_2/L), x_2)

We now get a wavefunction with an extra factor of x!

>>> represent(PxOp()*PIABKet(), basis=XKet)
Wavefunction(-2**(1/2)*hbar*I*pi*n*(1/L)**(1/2)*cos(pi*n*x_2/L)/L, x_2)

The PxOp actually takes the derivative of the wavefunction correctly! (Momentum operators in the position basis are differential operators).

>>> represent(PIABBra()*XOp()*PIABKet(), basis=XKet)
0

Here, we insert two unities. Integrating over the first collapses a delta function, while integrating over the second computes the expectation value of x for the particle-in-a-box wavefunction (which is what you would expect from representing <psi|X|psi>).

EDIT: As you’ll see in the comments, Raoul pointed out that this is actually incorrect, and I will be looking into it!
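Whatever represent ultimately returns here, the textbook expectation value can be checked independently with plain sympy integration (a quick sanity check that uses none of the quantum machinery above):

```python
from sympy import Symbol, sin, pi, integrate, sqrt, simplify

x = Symbol('x', positive=True)
L = Symbol('L', positive=True)
n = Symbol('n', positive=True, integer=True)

# normalized particle-in-a-box eigenfunction
psi = sqrt(2/L)*sin(n*pi*x/L)

norm = simplify(integrate(psi**2, (x, 0, L)))           # 1
expectation = simplify(integrate(x*psi**2, (x, 0, L)))  # L/2
```

So <psi|X|psi> should come out to L/2 for every n, which is what the represent machinery should reproduce once the unities are integrated correctly.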

There are still a few kinks being worked out, but we're very close to having a nice finished product, which is why I am reserving a longer post for tomorrow. I am quite pleased with the results we're seeing so far, though!

An altered internal API for default representations and fun with Wavefunctions

This week was a very, very busy week. While there were only a handful of commits, they were probably some of the most crucial ones yet in bringing this home and being able to start writing example systems. With pencils down only two weeks away, it is seriously crunch time.

A very nice development was the incorporation of Wavefunctions (Wf) and Differential Operators (DO) into the main represent logic. If you now try to represent anything in cartesian.py, you will get expressions that actually include Wf and have the DO applied appropriately! (I will soon be adding support for a qapply flag which, if set to False, will leave all DOs intact without applying them.) Support for this included adding calls to qapply, as well as a function which unwraps all Wf around expressions before integrating over unities and then rewraps the final result in a single Wf object.

With this done, I moved on to another quite important task in the represent logic that I discussed with Brian: dealing with the ambiguous case in represent where no basis is specified in the initial call. The current way the quantum modules handle this involves the states having an internal _represent_default_basis method, which simply calls the appropriate _represent_FOO method internally in the class. This entire process is basically a black box to the outer represent logic. That becomes a problem in continuous bases, where we actually need to know which basis was chosen for the representation (in order to know whether any unities were inserted, and to integrate appropriately). There is also a more general problem: given an arbitrary expression, you don't know whether all of the internal _represent_default_basis calls represent the individual QExprs in the same basis! You could, in fact, have one QExpr represented in a different basis than another and end up with some very weird results.

So, there are really two problems exposed by this API weakness, and we need to address them both. First, the main represent logic needs to know what the default basis for a given QExpr is. Second, in the case of arbitrary quantum expressions, we need to choose one basis to represent all of the individual QExprs in.

Brian and I have discussed strategies to address both of these issues. The first is a change in the internal API for default representations. Rather than using the current _represent_default_basis, classes should now contain an internal _get_default_basis, which simply returns the class which is meant to be the default basis for that QExpr. This way, we can still replicate the previous behavior, but the main represent logic now knows which basis is the default for that class. Because this involves changing an already quite large base of classes in the spin and quantum computing modules, this was quite an arduous task. After much testing and grappling with errors, the spin and cartesian classes now follow the new represent conventions, with the quantum computing classes soon to follow. (I should note that, as these changes were made, the spin classes were also changed to follow two additional new conventions for represent. First, the basis option is now converted internally to a basis state rather than left as an operator, which meant that all _represent_Op methods had to be changed to _represent_Ket methods. Second, rather than specifying a single operator, you must now specify the complete commuting set of operators for a given eigenket; for the spin classes, rather than passing basis=Jx, you now pass basis=set([J2, Jx]).)

The second change is to have represent choose which basis to represent in during the first step of the recursion. With the current algorithm, _represent_default_basis was simply called at the lowest level of the recursion, on the individual QExpr. Now, the basis is chosen based on the default basis of the first QExpr in the expression, and all QExprs in the arbitrary expression are represented in this basis. This means all the representations will be consistent and we aren’t left with any strange final expressions. This change in the represent logic is next on my TODO list.
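The two changes can be sketched together in a toy form (every name here is an illustrative stand-in, not the real sympy class):

```python
class XKet:
    """Stand-in for a position eigenket class."""

class JzKet:
    """Stand-in for a spin eigenket class."""

class PositionState:
    @classmethod
    def _get_default_basis(cls):
        # each class reports the basis class it defaults to
        return XKet

class SpinState:
    @classmethod
    def _get_default_basis(cls):
        return JzKet

def choose_basis(factors, basis=None):
    """Pick the basis once, from the first QExpr, so that every
    factor of an arbitrary expression is represented consistently."""
    if basis is None and factors:
        basis = factors[0]._get_default_basis()
    return basis

# a mixed expression is now represented entirely in one basis:
# the default basis of its first factor
choose_basis([PositionState(), SpinState()])   # -> XKet
```

The point of the sketch: the default basis is chosen once at the top of the recursion, not independently inside each QExpr.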

It is really nice to see all of the pieces of a now very robust representation logic coming together. The schedule I have been following is not quite what I initially planned, because the changes to the represent logic for continuous bases ended up being far more subtle than I anticipated. Given all the possible quantum systems out there, I really do envision that, at the end of the day, this revamped represent will be able to handle whatever is thrown at it. By GSoC pencils down, I will have at least a few different nice example systems implemented and some very clear documentation. Documenting this is very important to me: I've put a ton of time into making sure it works in very general cases (with Brian's amazing help and discussion), so making sure users know how it works and how to use its full power is a top priority.

Incorporating Wavefunction and DifferentialOperator into representations

This post is a bit late because I’ve been experiencing some computer troubles recently.

Since my last post, the represent branch has been merged into sympy master! There are still some issues with representing continuous bases that need to be ironed out, so I have started a represent2 branch to deal with some of these changes.

The first commits in the branch are some simple fixes to docstrings and cleanup of the interfaces of Wavefunction (Wf) and DifferentialOperator (DO). Of note is the addition of _eval_expand methods to Wf and DO, so that we can do simplifications like expand(Wavefunction((x+y)**2, x, y)) == Wavefunction(x**2, x, y) + Wavefunction(y**2, x, y) + Wavefunction(2*x*y, x, y).

The biggest tasks for represent2 are to incorporate the new Wf and DO classes into the internal represent logic of classes in quantum. Another big task is to simplify the logic of represent. In particular, rep_innerproduct and rep_expectation will become helper functions rather than being called from the main represent logic. This simplifies the main logic, but retains these methods to be called from the internal _represent methods. This is important because these functions know about proper indexing to form representations in these very standard forms, and this means we don’t have to reproduce that logic internally.

Both of these tasks are proving to be quite tricky and time consuming. I hope to have them done in the next couple of days. If so, I expect to be able to move rapidly on to implementing coordinate systems and finally getting some example textbook quantum systems up and running. With only a few weeks of GSoC left, I'll have to work extra hard to make that happen. Modifying the represent logic is taking much longer than expected, but we're dealing with very complicated logic here and we want to make sure it's robust. I'm confident, though, that I can have a nice finished product by the end of GSoC.

Summarizing the changes in represent

As the pending pull request gets closer to being merged, I thought it might be helpful to myself and others interested in the code if I summarized the changes that I have made to the code in represent.py. Hopefully people can use this post as a reference as they start digging into that portion of the code in reviewing the PR.

Represent has behavior to handle almost any kind of expression, from Add, TensorProduct, or Mul down to the basic QExpr. Most of the more complicated expression types, like Add, recursively call represent on smaller atoms of the expression, ultimately terminating at a single QExpr. The logic for handling a single QExpr is what I mainly modified in my work. Previously, the QExpr handling was a simple attempt to call expr._represent. Now, however, the following sequence occurs:

  • First, call get_basis. This function exists to unify the different ways of calling represent. Whether represent is called with an operator or a state class/instance as the basis in options, get_basis will examine it and try to return a state instance. At this point, the replace_none option is set to False. This is because _represent_default_basis will only be called from expr._represent if no basis is specified; since the basis that get_basis returns may not be the one chosen by _represent_default_basis, we want to make sure that if the basis in options is None, _represent_default_basis still gets called.
  • After setting the basis appropriately, call expr._represent. If this is successful, return the result.
  • If a NotImplementedError (NIE) is raised, we set the replace_none option to True so that any future calls to get_basis will actually fill a basis into the options. At this point, any attempted call to _represent_default_basis has failed, so we should now try representing in a basis state that we know.
  • If the passed QExpr is a Ket or Bra, we then call rep_innerproduct, which will return an innerproduct (<x’|x>) type representation for the given state.
  • If the passed QExpr is an Operator, we call rep_expectation, which will return an expectation-value-like representation (<x’ | A | x>) for the given Operator.
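In stripped-down form, the sequence above looks roughly like this (get_basis, rep_innerproduct, rep_expectation, and the state/operator classes here are toy stand-ins for the real code in represent.py):

```python
class Ket: pass
class Bra: pass
class Operator: pass

def get_basis(expr, basis=None, replace_none=True, **options):
    # toy stand-in: fill in a default basis only if replace_none is set
    if basis is None and replace_none:
        return 'default basis state'
    return basis

def rep_innerproduct(expr, **options):
    return '<x_1|x>'      # innerproduct-style representation

def rep_expectation(expr, **options):
    return '<x_2|A|x_1>'  # expectation-style representation

def represent_qexpr(expr, **options):
    # step 1: resolve the basis, leaving None alone for now so that
    # expr._represent can still fall back to _represent_default_basis
    options['replace_none'] = False
    options['basis'] = get_basis(expr, **options)
    try:
        # step 2: let the class try its own _represent
        return expr._represent(**options)
    except NotImplementedError:
        # step 3: the default-basis path failed; from now on
        # get_basis fills a concrete basis state into the options
        options['replace_none'] = True
        options['basis'] = get_basis(expr, **options)
        if isinstance(expr, (Ket, Bra)):
            # step 4: innerproduct representation for states
            return rep_innerproduct(expr, **options)
        if isinstance(expr, Operator):
            # step 5: expectation-value representation for operators
            return rep_expectation(expr, **options)
        raise

# a ket and an operator whose default-basis path fails
class XKet(Ket):
    def _represent(self, **options):
        raise NotImplementedError

class AOp(Operator):
    def _represent(self, **options):
        raise NotImplementedError

represent_qexpr(XKet())   # -> '<x_1|x>'
represent_qexpr(AOp())    # -> '<x_2|A|x_1>'
```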

In addition to QExpr handling, there were some additions to the processing of Muls. Represent is still called on each arg of the Mul individually, but code has been added to keep track of the unities that have been inserted and of the current index that should be appended to dummy kets. Details of where the unities are inserted are in this previous blog post.

Finally, before the result is returned, integrate_result is now called as well. If the expression passed to it is an Expr (presumably a continuous result), then it integrates over any unities (e.g. |x_1><x_1|) which were inserted into the quantum expression. This collapses some of the Delta functions that were originally in the expression.
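The collapsing step leans on the usual delta-sifting identity, which sympy's integrate handles directly:

```python
from sympy import symbols, Function, DiracDelta, integrate, oo

x, x_1 = symbols('x x_1', real=True)
f = Function('f')

# integrating over the dummy variable of an inserted unity collapses
# the corresponding delta: int f(x_1) delta(x - x_1) dx_1 = f(x)
result = integrate(f(x_1)*DiracDelta(x - x_1), (x_1, -oo, oo))
```

This is exactly what happens, delta by delta, when integrate_result integrates over the dummy variables of the inserted |x_1><x_1| unities.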

To see the latest represent logic, check out the version in my branch here.

A week of cleanup

This week was cut short by storms in the Midwest again. I lost power and internet from Monday until Wednesday, so I was stuck trying to find local cafes that still had both power and internet. I did manage to have a very productive conversation with Brian about things that should still be cleaned up. I will post a more detailed description of the changes once I've finished the to-do list we came up with, but for right now, some general themes that I've been working on include:

  • Docstring cleanup throughout the PR: as this is my first time working with Sympy, I’m still figuring out how to write good docs that conform to the standards throughout the code base. As such, a lot of work will go into writing better descriptions and examples, as well as cleaning up the formatting.
  • More test coverage: Many tests have been added, but it’s time to start thinking more generally and come up with more complicated tests. Examples of tests that were added include support for Bras in the operator <-> state mapping, as well as tests of multi-dimensional Laplacians in DifferentialOperator.
  • API streamlining: the API for DifferentialOperator described in my last post was a bit kludgy, so we’ve now reduced it to just the general expression form. This might be more verbose, but it also simplifies the internal code quite a bit. In Wavefunction, I’ve cleaned up the normalization a bit and added a normalize() function which returns a normalized Wavefunction. Things like _eval_dagger and _eval_conjugate are defined now as well.

These changes were mainly to operatorset.py, Wavefunction, and DifferentialOperator. You can see some commits here, here, and here. There’s still a bit left to do, mostly with docstrings and test coverage, and I will talk more about this later. This week was also the GSoC midterm evaluation, so I will try to post a summary of the first half and the changes that I have made as well.

Revamped DifferentialOperator and pull request cleanup

The first task that I handled this week was to generalize DifferentialOperator so that it can handle more complicated expressions like a Laplacian. Its constructor now operates in three different modes (see the commit here):

  1. A single symbol or string: this represents the variable to differentiate with respect to. This is a simple \frac{d}{dx} type operator.
  2. A tuple with two entries: the first entry of the tuple is the same as in 1), but the second is an integer representing the order of the derivative to be taken. Therefore, DifferentialOperator(("x", 2)) would represent \frac{d^2}{dx^2}.
  3. Two arguments, an Expr and a tuple: the first argument is a general expression involving a Function. The second argument is a tuple whose first entry is the symbol of the Function in argument 1 which is to be replaced by the Wavefunction we apply this operator to, and whose second entry is the symbol with which the function is evaluated before replacement. So, another way of expressing a single derivative would be DifferentialOperator(Derivative(f(x), x), (f, x)), where f = Function('f') and x = Symbol('x'). This allows us to put in any arbitrary expression and have it applied to the Wavefunction accordingly.
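For reference, the general expression form is the one that survives in sympy today, though the second argument ended up being the applied function f(x) rather than the (f, x) tuple described above; a short session:

```python
from sympy import Derivative, Function, Symbol
from sympy.physics.quantum.operator import DifferentialOperator
from sympy.physics.quantum.qapply import qapply
from sympy.physics.quantum.state import Wavefunction

x = Symbol('x')
f = Function('f')

# d/dx as a DifferentialOperator, applied to a Wavefunction
d = DifferentialOperator(Derivative(f(x), x), f(x))
w = Wavefunction(x**2, x)
result = qapply(d*w)   # Wavefunction(2*x, x)
```

Any expression built from f(x) and its derivatives (a Laplacian, for instance) works the same way.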

Once the new DifferentialOperator was implemented, it was time to add this functionality to cartesian.py. As a result, it is now possible to represent X operators in the momentum basis and vice versa!

In addition to the DifferentialOperator changes, this week was about coming up with more tests and trying to find bugs in corner cases of representations. One bug related to integration in sympy was discovered when testing out representations that involve DifferentialOperator. DifferentialOperator is non-commutative, and as_independent, which is called by the integration routine, would treat these operators as if they did not depend on the variable of integration even when they did. I opened issue 2549 here and then shortly after had a pull request here with a solution that eventually got merged in. This meant that DifferentialOperators were now integrated over correctly, so more tests were added that took advantage of this new functionality.

I had a brief discussion with Brian on IRC about outstanding issues in the PR, and one thing that we both agreed on was that the enumerate_states function of represent.py would not work for very general kets. The way it was previously structured was to simply take the labels of kets and append indices to the end of them. This works for continuous bases, and is useful when trying to form an inner product like \langle x_1 | x \rangle, but it’s completely useless for kets of discrete bases. This commit factors the actual enumeration of the states out to an internal ket method, leaving represent.py’s enumerate_states as the public interface. I think this design is ideal, because each ket can now decide for itself how to handle a request for a certain number of kets from the basis.

Hopefully Brian and I can discuss any leftover outstanding issues with my PR soon and we can get it merged in. After that happens, I’m going to start implementing many different coordinate systems and lay the groundwork for being able to handle many more example systems. This point in my schedule is when I had already wanted to be implementing the coordinate systems, so there’s a chance that I may fall a bit behind. Luckily, I incorporated a 3-week buffer into the end of the timeline just in case that happened, so I think I’m still on track to have a nice working product by the end of the summer.

Mapping states to operators and other pull request changes

This week has been a very busy and very productive week. I submitted my first big pull request for the changes to representations that this blog describes, and have had many discussions with Brian about issues and changes to that code. The pull request is here. This post will try to summarize some of the issues that were raised and point to the commits that address them.

Originally, I had written an API in StateBase for retrieving the operator associated with that state, and a corresponding API in Operator for retrieving the ket associated with that operator. Brian quickly realized, however, that this is not the best design for the sorts of things we’ll be working with, because many kets (particularly when you get to 2D or 3D systems) will actually be simultaneous eigenkets of multiple operators. So, instead of the previous API, I wrote a new file, operatorset.py, which contains operator_to_state and state_to_operator mapping functions. This is nice because, in making the file, we define a global dictionary with all of the appropriate class mappings, so that there’s a complete record of these correspondences in one place. In place of the _get_basis_kets function, which was important for trying to form inner products or expectation values, I added an enumerate_states method to represent.py with essentially the same functionality.

This seemed to be working quite well, but I soon stumbled upon an issue related to passing Bra states into the global state_to_operator function. The dictionary for mapping states to operators simply has ket classes and their corresponding operators. So, to map a bra state to an operator, we need to know its dual_class. If someone passes a Bra instance, this is fine, because dual_class is a property. However, if they instead pass a Bra class, we can’t call dual_class because it’s a property! So, one task in this pull request was to change dual_class to a classmethod throughout sympy.physics.quantum, which worked perfectly.
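With dual_class as a classmethod, the class-vs-instance distinction disappears, which is exactly what the Bra handling in state_to_operator needs:

```python
from sympy.physics.quantum.cartesian import XBra, XKet

# works on the class itself...
XKet.dual_class()     # XBra
# ...and on an instance, so both kinds of argument can be mapped
XKet().dual_class()   # XBra
```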

Some other concerns were raised as well. To see the full set of commits, you can look at the PR, but some other ones included:

  • Commutativity of Wavefunctions and DifferentialOperators: If we represent an expression in terms of Wavefunctions and DifferentialOperators, we want that expression to preserve its order so that we can call qapply on it. DifferentialOperators are already set to be non-commutative, but Wavefunctions inherit from Function, which has its own rules for commutativity. So, one simple fix was to override Function’s is_commutative in Wavefunction. Relatedly, DifferentialOperator was changed to return a Wavefunction rather than a plain expression when applied to a Wavefunction. This is very important when there are two consecutive derivatives, since the _apply_operator_Wavefunction defined in DifferentialOperator needs a Wavefunction to act on.
  • default_label changed to default_args: The default_label functionality was changed to default_args, which now returns a tuple rather than a simple string. If no args are passed, the QExpr constructor will try to call default_args for that object to create a default instance. Another notable change is that, previously, I had a default_label defined in StateBase which would try, by default, to simply lowercase the default labels of its corresponding operator. It turns out that this is not at all general, so we instead opt for having each Ket and Operator define its own default_args. This will be the most flexible framework for being able to handle all kinds of complicated states. Some examples using this API are in cartesian.py
  • default_args added to many base classes: Once the default_args API was well defined, I added default_args for Operator, Ket, and TimeDependentKet, with some tests. In Bra, I put a default_args which simply returns the default_args of its dual_class. This can be overridden for special cases, but I think it is a good default; it is probably redundant to have to define default_args in both a Bra and a Ket class.

Many changes have been made, and I think the PR is almost ready to be merged. Brian also raised some concerns about being able to handle more general DifferentialOperators, like Laplacians. This will require some more thinking on my part, and will probably be a task for next week.

Debugging DiracDelta integration and next steps in representations

This week has been pretty eventful, though not much code itself has been written. In Illinois we had some storms early in the week which knocked out my internet for a few days…but now I’m back up and running.

I mentioned in my last post that I found a bug in the integration of DiracDelta functions. I had my first branch of GSoC merged into sympy with a fix for this, but then found out that the tests that I had written for it were failing in master for some other users! The problem that I fixed was that, when going through a Mul with multiple DiracDeltas, deltaintegrate would only keep the last one it found and completely throw out the rest. Instead, I had it collapse the first one it found containing the integration variable, and keep the rest with the appropriate substitutions. This comes from the identity

\int f(x) \delta(x-y) dx = f(y)

If f(x) has other delta functions inside of it, the identity still applies!

The problem with my fix turned out to be a sorting issue! The test that I had written was tailored for the results on my computer, but the args of the Mul were being sorted differently on others’ computers so that the test ended up failing. I then opened another pull request with code that used sympy’s default_sort_key in deltaintegrate to make sure all the arguments of the Mul were sorted the same way before the function was integrated, and that fixed the issue!
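The fix boils down to sorting the Mul's args deterministically before integrating, so that every machine sees the factors in the same order; a minimal illustration:

```python
from sympy import DiracDelta, default_sort_key, symbols

x, x_1, x_2 = symbols('x x_1 x_2')

# two different orderings of the same factors...
args_a = [DiracDelta(x_2 - x_1), x_2, DiracDelta(x - x_2)]
args_b = list(reversed(args_a))

# ...sort to the same canonical order under default_sort_key
canonical_a = sorted(args_a, key=default_sort_key)
canonical_b = sorted(args_b, key=default_sort_key)
```

With a canonical order in deltaintegrate, "the first delta found with the integration variable" is the same delta everywhere, and the tests become machine-independent.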

In addition to that, I was able to touch base with my mentor Brian Granger and he gave me some good suggestions on changes to make to my existing code and also some new directions to take in representing continuous bases. I think these will lay good groundwork for starting to actually implement multiple coordinate systems and the canonical quantum systems (particle in a box, harmonic oscillator, etc.). These changes include:

  1. Rather than having an API to get the basis kets of an operator, it makes more sense to have an API to get the basis operators of a given ket. This is because in higher dimensions, position kets are actually simultaneous eigenkets of multiple operators. To this end, there will be a global function which can map between states and sets of operators, or vice versa.
  2. In addition to this, rather than just taking an operator class as the specification for a basis in represent, represent will now be able to take three things as inputs for the basis: a single operator, a set of operators, or a state. It will then use the global function to map between them.
  3. The default labeling system needs to be revamped with states with multiple labels in mind. It makes more sense to define it as something like "default_args" and have it always return a tuple. More tests with time-dependent states will also help make sure this is clear.
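The default_args idea sketched in 3) is how things eventually landed in sympy's cartesian.py; in use:

```python
from sympy.physics.quantum.cartesian import XKet, XOp

# constructing with no arguments falls back to each class's
# default_args, giving a fully specified default instance
k = XKet()   # same as XKet('x')
a = XOp()    # same as XOp('X')
```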

I have also been starting to think about how to handle implementing the various coordinate systems that will need to be done. It might make sense to have a general Position-type class, which can then be called to have a requested coordinate set returned. This needs to be fleshed out a bit more, though. First, I will work on making the changes Brian suggested and submit my first big GSoC pull request.

Representing arbitrary quantum expressions in sympy

While the code in sympy.physics.quantum is currently very good at representations in discrete Hilbert spaces, representing expressions in continuous Hilbert spaces requires a lot of legwork. The work I’ve done so far does a good job of representing simple expressions, but what about some long arbitrary expression?

Brian gave a good example. Let’s say we want to represent the expression

A B|\psi\rangle\langle\phi|C D|\alpha\rangle\langle\beta|

in the position basis. The easiest way to do this is just to insert unities in the appropriate places. So, we try to evaluate

\langle x_7 | A | x_6\rangle\langle x_6|B|x_5\rangle\langle x_5|\psi\rangle\langle\phi|x_4\rangle\langle x_4|C|x_3\rangle\langle x_3 | D | x_2 \rangle \langle x_2|\alpha\rangle\langle\beta|x_1\rangle

where we then integrate over x_2, x_3, x_4, x_5, and x_6.

Let’s now consider a simpler case, where it’s also easier to see directly the expected result. How might we go about representing the expression

X|x\rangle

where |x\rangle is an eigenket of the position operator X. The easiest way to do this is to just bring in a position bra from the left, to compute

\langle x_1 | X | x\rangle = x\delta(x-x_1)

Let’s make sure this is consistent with what we get if we instead insert a unity like we would with any arbitrary expression:

\langle x_2 | X | x_1 \rangle \langle x_1 | x \rangle = x_1\delta(x_2-x_1)\delta(x-x_1)

If we then integrate over x_1

= x\delta(x-x_2)

Since x_1 and x_2 are just dummy variables, this is indeed consistent.
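The X|x\rangle example can be run through sympy's represent directly; whatever dummy indices get chosen, a position-space delta function survives in the result:

```python
from sympy import DiracDelta
from sympy.physics.quantum import represent
from sympy.physics.quantum.cartesian import XKet, XOp

# <x'| X |x> = x delta(x - x'), up to the choice of dummy labels
result = represent(XOp()*XKet(), basis=XKet)
```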

How do we then implement this in represent.py? Well, first, it’s useful to change the get_basis_kets in HermitianOperator a bit. Instead of just taking a number of kets, it now takes a number of kets AND a start index. That way, if we recursively keep track of the index in the options of represent, rep_innerproduct and rep_expectation know which basis kets they should be using for their calculations when they are called.

If we are keeping track of an index for kets, let’s think about when we should increment this index. There are a few key cases to consider (reading the expression from right to left):

  1. Ket followed by an operator \left(X|x\rangle\right): Here, the ket and the operator should be using the same index for their kets. The representation of |x\rangle is \langle x_1| x \rangle and representation of X will start with the same ket (\langle x_2 | X | x_1 \rangle). In this case, we don’t increment our index in between processing the ket and the operator.
  2. Bra followed by a ket \left(|x\rangle\langle x'|\right): In this case, the two representations should use different kets (there’s nowhere to insert a unity), so we should increment our index before processing the ket, thus calculating \langle x_2 | x \rangle \langle x' | x_1 \rangle
  3. Operator followed by a bra \left(\langle x| X\right): This is a case where we are inserting a unity, but we still need to increment the index we’re keeping track of. That’s because the representation of X starts with an |x_1\rangle ket on the right side, so the index when we take this representation is just 1. But, for representing the bra, we take \langle x|x_2\rangle, so we increment the index. Thus, we can take the representation as \langle x | x_2 \rangle \langle x_2 | X | x_1 \rangle.
  4. Operator followed by an operator \left(AB\right): We again have to increment the index even though we’re inserting a unity. The start index of the expectation value for the first operator will be one less than the second, so we end up calculating \langle x_3 | A | x_2\rangle \langle x_2 | B | x_1 \rangle.
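The four cases collapse to a single rule: the index is bumped at every boundary except ket-then-operator. A toy encoding (purely illustrative, not the actual represent.py code):

```python
def should_increment(right, left):
    """Whether to bump the dummy-ket index when stepping from the
    factor on the right ('ket', 'bra' or 'op') to the factor on its
    left, reading the expression right to left."""
    # Case 1: a ket followed by an operator shares its index
    if right == 'ket' and left == 'op':
        return False
    # Cases 2-4: bra|ket, op|bra and op|op boundaries get a new index
    return True

# the four cases above, in order
should_increment('ket', 'op')   # False: X|x> shares one index
should_increment('bra', 'ket')  # True:  |x><x'| needs two
should_increment('op', 'bra')   # True:  <x|X inserts a unity
should_increment('op', 'op')    # True:  AB inserts a unity
```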

Now that we know how to insert unities, we also need to keep track of which kets we inserted so that we can integrate over the appropriate coordinates. In the case of the position basis, there will be many delta functions that come out of that, but most of them collapse out when you actually do the integral. I first tried using integrate() on the resulting expression directly, but found a bug where the delta functions were incorrectly collapsed. An example:

>>> x, x_1, x_2 = symbols('x,x_1,x_2')
>>> f = x_2 * DiracDelta(x-x_2) * DiracDelta(x_2-x_1)
>>> integrate(f, (x_2, -oo, oo))
x_1

We expect, instead, either x*DiracDelta(x-x_1) or x_1*DiracDelta(x-x_1). As a first approach to this, and to collapsing Dirac deltas in general, I wrote a collapse_deltas function which, given a list of unities, would collapse the delta functions. I later discovered that a function in the integrals module, deltaintegrate(), already accomplished much of this, so I instead created a new branch and pushed a fix for the bug I show above.
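With that fix long since merged, the example above now keeps the remaining delta instead of silently dropping it:

```python
from sympy import DiracDelta, integrate, oo, symbols

x, x_1, x_2 = symbols('x x_1 x_2', real=True)

f = x_2 * DiracDelta(x - x_2) * DiracDelta(x_2 - x_1)
result = integrate(f, (x_2, -oo, oo))

# the bug produced a bare x_1; fixed deltaintegrate collapses the
# first delta and keeps the other one with the substitution applied
```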

Phew. That was a lot. The code for the updated represent logic can be seen in my represent branch here. The pull request for my DiracDelta bug fix is here.

Further changes to representations

This past week brought further changes to how representations are done in sympy.physics.quantum, as well as some basic tests and doctests.

First, I added some basic tests for the default label functionality that I talked about last week. Those are here.

Beyond that, I added code to represent which allows for the calculation of operator representations, in the form of <x’ | A | x>. There is still a question, which was discussed on the list, of how to deal with representations of arbitrary expressions. I think the idea is simply to try to apply a bra from the basis you are attempting to represent in on the left of the expression. One small subtlety is that this means calling represent with the position basis on A | x> would give you the same result as calling represent on just A (that is, <x’ | A | x>). The code for that, and tests, is in this commit.

Next, I set about making some basic Wavefunction and DifferentialOperator classes. There was some debate as to whether Wavefunction is needed at all, and I think there are some justifications for it. First of all, it would mean that all representations in continuous bases could have the same class: rather than returning simple expressions, we would return a function which could be evaluated at any point. Additionally, this class can have extra convenience functions for calculating normalization constants or quantum probabilities. It could also internally handle things like the indexing of states. Finally, it makes writing DifferentialOperator a bit easier; in fact, it is as simple as implementing an _apply_operator_Wavefunction method to calculate the derivative. There has also been debate over the implementation of Wavefunction, namely whether it should inherit from Lambda or Function. I have been trying to sort out implementation issues like this, because Function itself is not callable, so I need to define my own __call__ method, which essentially duplicates functionality that already exists in Lambda. Initial code for this is here, with the first tests here. Some changes that I made today after reading discussions on the list are here.
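The callable behavior that motivates the __call__ method, as it works in the Wavefunction class that eventually landed in sympy:

```python
from sympy import Symbol, pi, sin, sqrt
from sympy.physics.quantum.state import Wavefunction

x = Symbol('x', positive=True)
L = Symbol('L', positive=True)

# a Wavefunction can be evaluated at a point like a function
w = Wavefunction(x**2, x)
w(2)      # 4

# and the normalization conveniences mentioned above
g = Wavefunction(sqrt(2/L)*sin(pi*x/L), (x, 0, L))
g.norm    # 1
```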

In the next week I’d really like to nail down the implementation of Wavefunction, as well as an example using the already written code in piab.py and changing represent to use Wavefunction. Hopefully by the end of the week I can have a nice example and some pretty plots!