Dataset Viewer
instance_id
stringlengths 10
57
| text
stringlengths 1.86k
8.73M
| repo
stringlengths 7
53
| base_commit
stringlengths 40
40
| problem_statement
stringlengths 23
37.7k
| hints_text
stringclasses 300
values | created_at
stringdate 2015-10-21 22:58:11
2024-04-30 21:59:25
| patch
stringlengths 278
37.8k
| test_patch
stringlengths 212
2.22M
| version
stringclasses 1
value | FAIL_TO_PASS
sequencelengths 1
4.94k
| PASS_TO_PASS
sequencelengths 0
7.82k
| environment_setup_commit
stringlengths 40
40
|
---|---|---|---|---|---|---|---|---|---|---|---|---|
barrust__pyspellchecker-87 | You will be provided with a partial code base and an issue statement explaining a problem to resolve.
<issue>
English spellchecking
Hello Team!
I am new to the Project and I have a question.
I use python 3.7 and run into problem with this test program:
``` python
from spellchecker import SpellChecker
spell = SpellChecker()
split_words = spell.split_words
spell_unknown = spell.unknown
words = split_words("That's how t and s don't fit.")
print(words)
misspelled = spell_unknown(words)
print(misspelled)
```
With pyspellchecker ver 0.5.4 the printout is:
``` python
['that', 's', 'how', 't', 'and', 's', 'don', 't', 'fit']
set()
```
So free standing 't' and 's' are not marked as errors neither are contractions.
If I change the phrase to:
```python
words = split_words("That is how that's and don't do not fit.")
```
and use pyspellchecker ver 0.5.6 the printout is:
``` python
['that', 'is', 'how', 'that', 's', 'and', 'don', 't', 'do', 'not', 'fit']
{'t', 's'}
```
So contractions are marked as mistakes again.
(I read barrust comment on Oct 22, 2019}
Please, assist.
</issue>
<code>
[start of README.rst]
1 pyspellchecker
2 ===============================================================================
3
4 .. image:: https://img.shields.io/badge/license-MIT-blue.svg
5 :target: https://opensource.org/licenses/MIT/
6 :alt: License
7 .. image:: https://img.shields.io/github/release/barrust/pyspellchecker.svg
8 :target: https://github.com/barrust/pyspellchecker/releases
9 :alt: GitHub release
10 .. image:: https://github.com/barrust/pyspellchecker/workflows/Python%20package/badge.svg
11 :target: https://github.com/barrust/pyspellchecker/actions?query=workflow%3A%22Python+package%22
12 :alt: Build Status
13 .. image:: https://codecov.io/gh/barrust/pyspellchecker/branch/master/graph/badge.svg?token=OdETiNgz9k
14 :target: https://codecov.io/gh/barrust/pyspellchecker
15 :alt: Test Coverage
16 .. image:: https://badge.fury.io/py/pyspellchecker.svg
17 :target: https://badge.fury.io/py/pyspellchecker
18 :alt: PyPi Package
19 .. image:: http://pepy.tech/badge/pyspellchecker
20 :target: http://pepy.tech/count/pyspellchecker
21 :alt: Downloads
22
23
24 Pure Python Spell Checking based on `Peter
25 Norvig's <https://norvig.com/spell-correct.html>`__ blog post on setting
26 up a simple spell checking algorithm.
27
28 It uses a `Levenshtein Distance <https://en.wikipedia.org/wiki/Levenshtein_distance>`__
29 algorithm to find permutations within an edit distance of 2 from the
30 original word. It then compares all permutations (insertions, deletions,
31 replacements, and transpositions) to known words in a word frequency
32 list. Those words that are found more often in the frequency list are
33 **more likely** the correct results.
34
35 ``pyspellchecker`` supports multiple languages including English, Spanish,
36 German, French, and Portuguese. For information on how the dictionaries were created and how they can be updated and improved, please see the **Dictionary Creation and Updating** section of the readme!
37
38 ``pyspellchecker`` supports **Python 3**
39
40 ``pyspellchecker`` allows for the setting of the Levenshtein Distance (up to two) to check.
41 For longer words, it is highly recommended to use a distance of 1 and not the
42 default 2. See the quickstart to find how one can change the distance parameter.
43
44
45 Installation
46 -------------------------------------------------------------------------------
47
48 The easiest method to install is using pip:
49
50 .. code:: bash
51
52 pip install pyspellchecker
53
54 To install from source:
55
56 .. code:: bash
57
58 git clone https://github.com/barrust/pyspellchecker.git
59 cd pyspellchecker
60 python setup.py install
61
62 For *python 2.7* support, install `release 0.5.6 <https://github.com/barrust/pyspellchecker/releases/tag/v0.5.6>`__
63
64 .. code:: bash
65
66 pip install pyspellchecker==0.5.6
67
68
69 Quickstart
70 -------------------------------------------------------------------------------
71
72 After installation, using ``pyspellchecker`` should be fairly straight
73 forward:
74
75 .. code:: python
76
77 from spellchecker import SpellChecker
78
79 spell = SpellChecker()
80
81 # find those words that may be misspelled
82 misspelled = spell.unknown(['something', 'is', 'hapenning', 'here'])
83
84 for word in misspelled:
85 # Get the one `most likely` answer
86 print(spell.correction(word))
87
88 # Get a list of `likely` options
89 print(spell.candidates(word))
90
91
92 If the Word Frequency list is not to your liking, you can add additional
93 text to generate a more appropriate list for your use case.
94
95 .. code:: python
96
97 from spellchecker import SpellChecker
98
99 spell = SpellChecker() # loads default word frequency list
100 spell.word_frequency.load_text_file('./my_free_text_doc.txt')
101
102 # if I just want to make sure some words are not flagged as misspelled
103 spell.word_frequency.load_words(['microsoft', 'apple', 'google'])
104 spell.known(['microsoft', 'google']) # will return both now!
105
106
107 If the words that you wish to check are long, it is recommended to reduce the
108 `distance` to 1. This can be accomplished either when initializing the spell
109 check class or after the fact.
110
111 .. code:: python
112
113 from spellchecker import SpellChecker
114
115 spell = SpellChecker(distance=1) # set at initialization
116
117 # do some work on longer words
118
119 spell.distance = 2 # set the distance parameter back to the default
120
121
122 Dictionary Creation and Updating
123 -------------------------------------------------------------------------------
124
125 The creation of the dictionaries is, unfortunately, not an exact science. I have provided a script that, given a text file of sentences (in this case from
126 `OpenSubtitles <http://opus.nlpl.eu/OpenSubtitles2018.php>`__) it will generate a word frequency list based on the words found within the text. The script then attempts to ***clean up*** the word frequency by, for example, removing words with invalid characters (usually from other languages), removing low count terms (misspellings?) and attempts to enforce rules as available (no more than one accent per word in Spanish). Then it removes words from a list of known words that are to be removed.
127
128 The script can be found here: ``scripts/build_dictionary.py```. The original word frequency list parsed from OpenSubtitles can be found in the ```scripts/data/``` folder along with each language's *exclude* text file.
129
130 Any help in updating and maintaining the dictionaries would be greatly desired. To do this, a discussion could be started on GitHub or pull requests to update the exclude file could be added. Ideas on how to add words that are missing along with a relative frequency is something that is in the works for future versions of the dictionaries.
131
132
133 Additional Methods
134 -------------------------------------------------------------------------------
135
136 `On-line documentation <http://pyspellchecker.readthedocs.io/en/latest/>`__ is available; below contains the cliff-notes version of some of the available functions:
137
138
139 ``correction(word)``: Returns the most probable result for the
140 misspelled word
141
142 ``candidates(word)``: Returns a set of possible candidates for the
143 misspelled word
144
145 ``known([words])``: Returns those words that are in the word frequency
146 list
147
148 ``unknown([words])``: Returns those words that are not in the frequency
149 list
150
151 ``word_probability(word)``: The frequency of the given word out of all
152 words in the frequency list
153
154 The following are less likely to be needed by the user but are available:
155 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
156
157 ``edit_distance_1(word)``: Returns a set of all strings at a Levenshtein
158 Distance of one based on the alphabet of the selected language
159
160 ``edit_distance_2(word)``: Returns a set of all strings at a Levenshtein
161 Distance of two based on the alphabet of the selected language
162
163
164 Credits
165 -------------------------------------------------------------------------------
166
167 * `Peter Norvig <https://norvig.com/spell-correct.html>`__ blog post on setting up a simple spell checking algorithm
168 * P Lison and J Tiedemann, 2016, OpenSubtitles2016: Extracting Large Parallel Corpora from Movie and TV Subtitles. In Proceedings of the 10th International Conference on Language Resources and Evaluation (LREC 2016)
169
[end of README.rst]
[start of CHANGELOG.md]
1 # pyspellchecker
2
3 ## Future Release
4 * Updated automated `scripts/build_dictionary.py` script to support adding missing words
5
6 ## Version 0.6.0
7 * Remove **python 2.7** support
8
9 ## Version 0.5.6
10 * ***NOTE:*** Last planned support for **Python 2.7**
11 * All dictionaries updated using the `scripts/build_dictionary.py` script
12
13 ## Version 0.5.5
14 * Remove `encode` from the call to `json.loads()`
15
16 ## Version 0.5.4
17 * Reduce words in `__edit_distance_alt` to improve memory performance; thanks [@blayzen-w](https://github.com/blayzen-w)
18
19 ## Version 0.5.3
20 * Handle memory issues when trying to correct or find candidates for extremely long words
21
22 ## Version 0.5.2
23 Ensure input is encoded correctly; resolves [#53](https://github.com/barrust/pyspellchecker/issues/53)
24
25 ## Version 0.5.1
26 Handle windows encoding issues [#48](https://github.com/barrust/pyspellchecker/issues/48)
27 Deterministic order to corrections [#47](https://github.com/barrust/pyspellchecker/issues/47)
28
29 ## Version 0.5.0
30 * Add tokenizer to the Spell object
31 * Add Support for local dictionaries to be case sensitive
32 [see PR #44](https://github.com/barrust/pyspellchecker/pull/44) Thanks [@davido-brainlabs](https://github.com/davido-brainlabs)
33 * Better python 2.7 support for reading gzipped files
34
35 ## Version 0.4.0
36 * Add support for a tokenizer for splitting words into tokens
37
38 ## Version 0.3.1
39 * Add full python 2.7 support for foreign dictionaries
40
41 ## Version 0.3.0
42 * Ensure all checks against the word frequency are lower case
43 * Slightly better performance on edit distance of 2
44
45 ## Version 0.2.2
46 * Minor package fix for non-wheel deployments
47
48 ## Version 0.2.1
49 * Ignore case for language identifiers
50
51 ## Version 0.2.0
52 * Changed `words` function to `split_words` to differentiate with the `word_frequency.words` function
53 * Added ***Portuguese*** dictionary: `pt`
54 * Add encoding argument to `gzip.open` and `open` dictionary loading and exporting
55 * Use of __slots__ for class objects
56
57 ## Version 0.1.5
58 * Remove words based on threshold
59 * Add ability to iterate over words (keys) in the dictionary
60 * Add setting to to reduce the edit distance check
61 [see PR #17](https://github.com/barrust/pyspellchecker/pull/17) Thanks [@mrjamesriley](https://github.com/mrjamesriley)
62 * Added Export functionality:
63 * json
64 * gzip
65 * Updated logic for loading dictionaries to be either language or local_dictionary
66
67 ## Version 0.1.4
68 * Ability to easily remove words
69 * Ability to add a single word
70 * Improved (i.e. cleaned up) English dictionary
71
72 ## Version 0.1.3
73 * Better handle punctuation and numbers as the word to check
74
75 ## Version 0.1.1
76 * Add support for language dictionaries
77 * English, Spanish, French, and German
78 * Remove support for python 2; if it works, great!
79
80 ## Version 0.1.0
81 * Move word frequency to its own class
82 * Add basic tests
83 * Readme documentation
84
85 ## Version 0.0.1
86 * Initial release using code from Peter Norvig
87 * Initial release to pypi
88
[end of CHANGELOG.md]
[start of spellchecker/utils.py]
1 """ Additional utility functions """
2 import re
3 import gzip
4 import contextlib
5
6
7 def ensure_unicode(s, encoding='utf-8'):
8 if isinstance(s, bytes):
9 return s.decode(encoding)
10 return s
11
12
13 @contextlib.contextmanager
14 def __gzip_read(filename, mode='rb', encoding='UTF-8'):
15 """ Context manager to correctly handle the decoding of the output of \
16 the gzip file
17
18 Args:
19 filename (str): The filename to open
20 mode (str): The mode to read the data
21 encoding (str): The file encoding to use
22 Yields:
23 str: The string data from the gzip file read
24 """
25 with gzip.open(filename, mode=mode, encoding=encoding) as fobj:
26 yield fobj.read()
27
28
29 @contextlib.contextmanager
30 def load_file(filename, encoding):
31 """ Context manager to handle opening a gzip or text file correctly and
32 reading all the data
33
34 Args:
35 filename (str): The filename to open
36 encoding (str): The file encoding to use
37 Yields:
38 str: The string data from the file read
39 """
40 if filename[-3:].lower() == ".gz":
41 with __gzip_read(filename, mode='rt', encoding=encoding) as data:
42 yield data
43 else:
44 with open(filename, mode="r", encoding=encoding) as fobj:
45 yield fobj.read()
46
47
48 def write_file(filepath, encoding, gzipped, data):
49 """ Write the data to file either as a gzip file or text based on the
50 gzipped parameter
51
52 Args:
53 filepath (str): The filename to open
54 encoding (str): The file encoding to use
55 gzipped (bool): Whether the file should be gzipped or not
56 data (str): The data to be written out
57 """
58 if gzipped:
59 with gzip.open(filepath, 'wt') as fobj:
60 fobj.write(data)
61 else:
62 with open(filepath, "w", encoding=encoding) as fobj:
63 fobj.write(data)
64
65
66 def _parse_into_words(text):
67 """ Parse the text into words; currently removes punctuation
68
69 Args:
70 text (str): The text to split into words
71 """
72 return re.findall(r"\w+", text.lower())
73
[end of spellchecker/utils.py]
</code>
I need you to solve this issue by generating a single patch file that I can apply directly to this repository using git apply. Please respond with a single patch file in the following format.
<patch>
--- a/file.py
+++ b/file.py
@@ -1,27 +1,35 @@
def euclidean(a, b):
- while b:
- a, b = b, a % b
- return a
+ if b == 0:
+ return a
+ return euclidean(b, a % b)
def bresenham(x0, y0, x1, y1):
points = []
dx = abs(x1 - x0)
dy = abs(y1 - y0)
- sx = 1 if x0 < x1 else -1
- sy = 1 if y0 < y1 else -1
- err = dx - dy
+ x, y = x0, y0
+ sx = -1 if x0 > x1 else 1
+ sy = -1 if y0 > y1 else 1
- while True:
- points.append((x0, y0))
- if x0 == x1 and y0 == y1:
- break
- e2 = 2 * err
- if e2 > -dy:
+ if dx > dy:
+ err = dx / 2.0
+ while x != x1:
+ points.append((x, y))
err -= dy
- x0 += sx
- if e2 < dx:
- err += dx
- y0 += sy
+ if err < 0:
+ y += sy
+ err += dx
+ x += sx
+ else:
+ err = dy / 2.0
+ while y != y1:
+ points.append((x, y))
+ err -= dx
+ if err < 0:
+ x += sx
+ err += dy
+ y += sy
+ points.append((x, y))
return points
</patch>
| barrust/pyspellchecker | aa9668243fef58ff62c505a727b4a7284b81f42a | English spellchecking
Hello Team!
I am new to the Project and I have a question.
I use python 3.7 and run into problem with this test program:
``` python
from spellchecker import SpellChecker
spell = SpellChecker()
split_words = spell.split_words
spell_unknown = spell.unknown
words = split_words("That's how t and s don't fit.")
print(words)
misspelled = spell_unknown(words)
print(misspelled)
```
With pyspellchecker ver 0.5.4 the printout is:
``` python
['that', 's', 'how', 't', 'and', 's', 'don', 't', 'fit']
set()
```
So free standing 't' and 's' are not marked as errors neither are contractions.
If I change the phrase to:
```python
words = split_words("That is how that's and don't do not fit.")
```
and use pyspellchecker ver 0.5.6 the printout is:
``` python
['that', 'is', 'how', 'that', 's', 'and', 'don', 't', 'do', 'not', 'fit']
{'t', 's'}
```
So contractions are marked as mistakes again.
(I read barrust comment on Oct 22, 2019}
Please, assist.
| 2021-02-22 20:09:34+00:00 | <patch>
diff --git a/CHANGELOG.md b/CHANGELOG.md
index e4a891d..7aa5b30 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,10 +1,9 @@
# pyspellchecker
-## Future Release
-* Updated automated `scripts/build_dictionary.py` script to support adding missing words
-
-## Version 0.6.0
+## Version 0.6.0 (future version)
* Remove **python 2.7** support
+* Updated automated `scripts/build_dictionary.py` script to support adding missing words
+* Updated `split_words()` to attempt to better handle punctuation; [#84](https://github.com/barrust/pyspellchecker/issues/84)
## Version 0.5.6
* ***NOTE:*** Last planned support for **Python 2.7**
diff --git a/spellchecker/utils.py b/spellchecker/utils.py
index deafce4..e00484d 100644
--- a/spellchecker/utils.py
+++ b/spellchecker/utils.py
@@ -64,9 +64,11 @@ def write_file(filepath, encoding, gzipped, data):
def _parse_into_words(text):
- """ Parse the text into words; currently removes punctuation
+ """ Parse the text into words; currently removes punctuation except for
+ apostrophies.
Args:
text (str): The text to split into words
"""
- return re.findall(r"\w+", text.lower())
+ # see: https://stackoverflow.com/a/12705513
+ return re.findall(r"(\w[\w']*\w|\w)", text.lower())
</patch> | diff --git a/tests/spellchecker_test.py b/tests/spellchecker_test.py
index 165371a..b403117 100644
--- a/tests/spellchecker_test.py
+++ b/tests/spellchecker_test.py
@@ -191,7 +191,6 @@ class TestSpellChecker(unittest.TestCase):
cnt += 1
self.assertEqual(cnt, 0)
-
def test_remove_by_threshold_using_items(self):
''' test removing everything below a certain threshold; using items to test '''
spell = SpellChecker()
@@ -398,3 +397,9 @@ class TestSpellChecker(unittest.TestCase):
self.assertTrue(var in spell)
self.assertEqual(spell[var], 60)
+
+ def test_split_words(self):
+ ''' test using split_words '''
+ spell = SpellChecker()
+ res = spell.split_words("This isn't a good test, but it is a test!!!!")
+ self.assertEqual(set(res), set(["this", "isn't", "a", "good", "test", "but", "it", "is", "a", "test"]))
| 0.0 | [
"tests/spellchecker_test.py::TestSpellChecker::test_split_words"
] | [
"tests/spellchecker_test.py::TestSpellChecker::test_add_word",
"tests/spellchecker_test.py::TestSpellChecker::test_adding_unicode",
"tests/spellchecker_test.py::TestSpellChecker::test_bytes_input",
"tests/spellchecker_test.py::TestSpellChecker::test_candidates",
"tests/spellchecker_test.py::TestSpellChecker::test_capitalization_when_case_sensitive_defaults_to_false",
"tests/spellchecker_test.py::TestSpellChecker::test_capitalization_when_case_sensitive_true",
"tests/spellchecker_test.py::TestSpellChecker::test_capitalization_when_language_set",
"tests/spellchecker_test.py::TestSpellChecker::test_checking_odd_word",
"tests/spellchecker_test.py::TestSpellChecker::test_correction",
"tests/spellchecker_test.py::TestSpellChecker::test_edit_distance_invalud",
"tests/spellchecker_test.py::TestSpellChecker::test_edit_distance_one",
"tests/spellchecker_test.py::TestSpellChecker::test_edit_distance_one_property",
"tests/spellchecker_test.py::TestSpellChecker::test_edit_distance_two",
"tests/spellchecker_test.py::TestSpellChecker::test_extremely_large_words",
"tests/spellchecker_test.py::TestSpellChecker::test_import_export_gzip",
"tests/spellchecker_test.py::TestSpellChecker::test_import_export_json",
"tests/spellchecker_test.py::TestSpellChecker::test_large_words",
"tests/spellchecker_test.py::TestSpellChecker::test_load_external_dictionary",
"tests/spellchecker_test.py::TestSpellChecker::test_load_text_file",
"tests/spellchecker_test.py::TestSpellChecker::test_missing_dictionary",
"tests/spellchecker_test.py::TestSpellChecker::test_pop",
"tests/spellchecker_test.py::TestSpellChecker::test_pop_default",
"tests/spellchecker_test.py::TestSpellChecker::test_remove_by_threshold",
"tests/spellchecker_test.py::TestSpellChecker::test_remove_by_threshold_using_items",
"tests/spellchecker_test.py::TestSpellChecker::test_remove_word",
"tests/spellchecker_test.py::TestSpellChecker::test_remove_words",
"tests/spellchecker_test.py::TestSpellChecker::test_spanish_dict",
"tests/spellchecker_test.py::TestSpellChecker::test_tokenizer_file",
"tests/spellchecker_test.py::TestSpellChecker::test_tokenizer_provided",
"tests/spellchecker_test.py::TestSpellChecker::test_unique_words",
"tests/spellchecker_test.py::TestSpellChecker::test_unknown_words",
"tests/spellchecker_test.py::TestSpellChecker::test_word_contains",
"tests/spellchecker_test.py::TestSpellChecker::test_word_frequency",
"tests/spellchecker_test.py::TestSpellChecker::test_word_in",
"tests/spellchecker_test.py::TestSpellChecker::test_word_known",
"tests/spellchecker_test.py::TestSpellChecker::test_word_probability",
"tests/spellchecker_test.py::TestSpellChecker::test_words",
"tests/spellchecker_test.py::TestSpellChecker::test_words_more_complete"
] | aa9668243fef58ff62c505a727b4a7284b81f42a |
|
fatiando__verde-237 | You will be provided with a partial code base and an issue statement explaining a problem to resolve.
<issue>
Create a convexhull masking function
**Description of the desired feature**
A good way to mask grid points that are too far from data is to only show the ones that fall inside the data convex hull. This is what many scipy and matplotlib interpolations do. It would be great to have a `convexhull_mask` function that has a similar interface to `distance_mask` but masks points outside of the convex hull. This function should take the same arguments as `distance_mask` except `maxdist`.
One way of implementing this would be with the `scipy.spatial.Delaunay` class:
```
tri = Delaunay(np.transpose(data_coordinates))
# Find which triangle each grid point is in. -1 indicates that it's not in any.
in_triangle = tri.find_simplex(np.transpose(coordinates))
mask = in_triangle > 0
```
</issue>
<code>
[start of README.rst]
1 .. image:: https://github.com/fatiando/verde/raw/master/doc/_static/readme-banner.png
2 :alt: Verde
3
4 `Documentation <http://www.fatiando.org/verde>`__ |
5 `Documentation (dev version) <http://www.fatiando.org/verde/dev>`__ |
6 `Contact <http://contact.fatiando.org>`__ |
7 Part of the `Fatiando a Terra <https://www.fatiando.org>`__ project
8
9
10 .. image:: http://img.shields.io/pypi/v/verde.svg?style=flat-square&label=version
11 :alt: Latest version on PyPI
12 :target: https://pypi.python.org/pypi/verde
13 .. image:: http://img.shields.io/travis/fatiando/verde/master.svg?style=flat-square&label=TravisCI
14 :alt: TravisCI build status
15 :target: https://travis-ci.org/fatiando/verde
16 .. image:: https://img.shields.io/azure-devops/build/fatiando/066f88d8-0495-49ba-bad9-ef7431356ce9/7/master.svg?label=Azure&style=flat-square
17 :alt: Azure Pipelines build status
18 :target: https://dev.azure.com/fatiando/verde/_build
19 .. image:: https://img.shields.io/codecov/c/github/fatiando/verde/master.svg?style=flat-square
20 :alt: Test coverage status
21 :target: https://codecov.io/gh/fatiando/verde
22 .. image:: https://img.shields.io/codacy/grade/6b698defc0df47288a634930d41a9d65.svg?style=flat-square&label=codacy
23 :alt: Code quality grade on codacy
24 :target: https://www.codacy.com/app/leouieda/verde
25 .. image:: https://img.shields.io/pypi/pyversions/verde.svg?style=flat-square
26 :alt: Compatible Python versions.
27 :target: https://pypi.python.org/pypi/verde
28 .. image:: https://img.shields.io/badge/doi-10.21105%2Fjoss.00957-blue.svg?style=flat-square
29 :alt: Digital Object Identifier for the JOSS paper
30 :target: https://doi.org/10.21105/joss.00957
31
32
33 .. placeholder-for-doc-index
34
35
36 About
37 -----
38
39 Verde is a Python library for processing spatial data (bathymetry, geophysics
40 surveys, etc) and interpolating it on regular grids (i.e., *gridding*).
41
42 Most gridding methods in Verde use a Green's functions approach.
43 A linear model is estimated based on the input data and then used to predict
44 data on a regular grid (or in a scatter, a profile, as derivatives).
45 The models are Green's functions from (mostly) elastic deformation theory.
46 This approach is very similar to *machine learning* so we implement gridder
47 classes that are similar to `scikit-learn <http://scikit-learn.org/>`__
48 regression classes.
49 The API is not 100% compatible but it should look familiar to those with some
50 scikit-learn experience.
51
52 Advantages of using Green's functions include:
53
54 * Easily apply **weights** to data points. This is a linear least-squares
55 problem.
56 * Perform **model selection** using established machine learning techniques,
57 like k-fold or holdout cross-validation.
58 * The estimated model can be **easily stored** for later use, like
59 spherical-harmonic coefficients are used in gravimetry.
60
61 The main disadvantage is the heavy memory and processing time requirement (it's
62 a linear regression problem).
63
64
65 Project goals
66 -------------
67
68 * Provide a machine-learning inspired interface for gridding spatial data
69 * Integration with the Scipy stack: numpy, pandas, scikit-learn, and xarray
70 * Include common processing and data preparation tasks, like blocked means and 2D trends
71 * Support for gridding scalar and vector data (like wind speed or GPS velocities)
72 * Support for both Cartesian and geographic coordinates
73
74 The first release of Verde was focused on meeting these initial goals and establishing
75 the look and feel of the library. Later releases will focus on expanding the range of
76 gridders available, optimizing the code, and improving algorithms so that
77 larger-than-memory datasets can also be supported.
78
79
80 Contacting us
81 -------------
82
83 * Most discussion happens `on Github <https://github.com/fatiando/verde>`__.
84 Feel free to `open an issue
85 <https://github.com/fatiando/verde/issues/new>`__ or comment
86 on any open issue or pull request.
87 * We have `chat room on Slack <http://contact.fatiando.org>`__
88 where you can ask questions and leave comments.
89
90
91 Citing Verde
92 ------------
93
94 This is research software **made by scientists** (see
95 `AUTHORS.md <https://github.com/fatiando/verde/blob/master/AUTHORS.md>`__). Citations
96 help us justify the effort that goes into building and maintaining this project. If you
97 used Verde for your research, please consider citing us.
98
99 See our `CITATION.rst file <https://github.com/fatiando/verde/blob/master/CITATION.rst>`__
100 to find out more.
101
102
103 Contributing
104 ------------
105
106 Code of conduct
107 +++++++++++++++
108
109 Please note that this project is released with a
110 `Contributor Code of Conduct <https://github.com/fatiando/verde/blob/master/CODE_OF_CONDUCT.md>`__.
111 By participating in this project you agree to abide by its terms.
112
113 Contributing Guidelines
114 +++++++++++++++++++++++
115
116 Please read our
117 `Contributing Guide <https://github.com/fatiando/verde/blob/master/CONTRIBUTING.md>`__
118 to see how you can help and give feedback.
119
120 Imposter syndrome disclaimer
121 ++++++++++++++++++++++++++++
122
123 **We want your help.** No, really.
124
125 There may be a little voice inside your head that is telling you that you're
126 not ready to be an open source contributor; that your skills aren't nearly good
127 enough to contribute.
128 What could you possibly offer?
129
130 We assure you that the little voice in your head is wrong.
131
132 **Being a contributor doesn't just mean writing code**.
133 Equality important contributions include:
134 writing or proof-reading documentation, suggesting or implementing tests, or
135 even giving feedback about the project (including giving feedback about the
136 contribution process).
137 If you're coming to the project with fresh eyes, you might see the errors and
138 assumptions that seasoned contributors have glossed over.
139 If you can write any code at all, you can contribute code to open source.
140 We are constantly trying out new skills, making mistakes, and learning from
141 those mistakes.
142 That's how we all improve and we are happy to help others learn.
143
144 *This disclaimer was adapted from the*
145 `MetPy project <https://github.com/Unidata/MetPy>`__.
146
147
148 License
149 -------
150
151 This is free software: you can redistribute it and/or modify it under the terms
152 of the **BSD 3-clause License**. A copy of this license is provided in
153 `LICENSE.txt <https://github.com/fatiando/verde/blob/master/LICENSE.txt>`__.
154
155
156 Documentation for other versions
157 --------------------------------
158
159 * `Development <http://www.fatiando.org/verde/dev>`__ (reflects the *master* branch on
160 Github)
161 * `Latest release <http://www.fatiando.org/verde/latest>`__
162 * `v1.3.0 <http://www.fatiando.org/verde/v1.3.0>`__
163 * `v1.2.0 <http://www.fatiando.org/verde/v1.2.0>`__
164 * `v1.1.0 <http://www.fatiando.org/verde/v1.1.0>`__
165 * `v1.0.1 <http://www.fatiando.org/verde/v1.0.1>`__
166 * `v1.0.0 <http://www.fatiando.org/verde/v1.0.0>`__
167
[end of README.rst]
[start of /dev/null]
1
[end of /dev/null]
[start of doc/api/index.rst]
1 .. _api:
2
3 API Reference
4 =============
5
6 .. automodule:: verde
7
8 .. currentmodule:: verde
9
10 Interpolators
11 -------------
12
13 .. autosummary::
14 :toctree: generated/
15
16 Spline
17 SplineCV
18 VectorSpline2D
19 ScipyGridder
20
21 Data Processing
22 ---------------
23
24 .. autosummary::
25 :toctree: generated/
26
27 BlockReduce
28 BlockMean
29 Trend
30
31 Composite Estimators
32 --------------------
33
34 .. autosummary::
35 :toctree: generated/
36
37 Chain
38 Vector
39
40 Model Selection
41 ---------------
42
43 .. autosummary::
44 :toctree: generated/
45
46 train_test_split
47 cross_val_score
48
49 Coordinate Manipulation
50 -----------------------
51
52 .. autosummary::
53 :toctree: generated/
54
55 grid_coordinates
56 scatter_points
57 profile_coordinates
58 get_region
59 pad_region
60 project_region
61 inside
62 block_split
63 rolling_window
64 expanding_window
65
66 Utilities
67 ---------
68
69 .. autosummary::
70 :toctree: generated/
71
72 test
73 maxabs
74 distance_mask
75 variance_to_weights
76 grid_to_table
77 median_distance
78
79 Input/Output
80 ------------
81
82 .. autosummary::
83 :toctree: generated/
84
85 load_surfer
86
87
88 .. automodule:: verde.datasets
89
90 .. currentmodule:: verde
91
92 Datasets
93 --------
94
95 .. autosummary::
96 :toctree: generated/
97
98 datasets.locate
99 datasets.CheckerBoard
100 datasets.fetch_baja_bathymetry
101 datasets.setup_baja_bathymetry_map
102 datasets.fetch_california_gps
103 datasets.setup_california_gps_map
104 datasets.fetch_texas_wind
105 datasets.setup_texas_wind_map
106 datasets.fetch_rio_magnetic
107 datasets.setup_rio_magnetic_map
108
109 Base Classes and Functions
110 --------------------------
111
112 .. autosummary::
113 :toctree: generated/
114
115 base.BaseGridder
116 base.n_1d_arrays
117 base.check_fit_input
118 base.least_squares
119
[end of doc/api/index.rst]
[start of examples/spline_weights.py]
1 """
2 Gridding with splines and weights
3 =================================
4
5 An advantage of using the Green's functions based :class:`verde.Spline` over
6 :class:`verde.ScipyGridder` is that you can assign weights to the data to incorporate
7 the data uncertainties or variance into the gridding.
8 In this example, we'll see how to combine :class:`verde.BlockMean` to decimate the data
9 and use weights based on the data uncertainty during gridding.
10 """
11 import matplotlib.pyplot as plt
12 from matplotlib.colors import PowerNorm
13 import cartopy.crs as ccrs
14 import pyproj
15 import numpy as np
16 import verde as vd
17
18 # We'll test this on the California vertical GPS velocity data because it comes with the
19 # uncertainties
20 data = vd.datasets.fetch_california_gps()
21 coordinates = (data.longitude.values, data.latitude.values)
22
23 # Use a Mercator projection for our Cartesian gridder
24 projection = pyproj.Proj(proj="merc", lat_ts=data.latitude.mean())
25
26 # Now we can chain a block weighted mean and weighted spline together. We'll use
27 # uncertainty propagation to calculate the new weights from block mean because our data
28 # vary smoothly but have different uncertainties.
29 spacing = 5 / 60 # 5 arc-minutes
30 chain = vd.Chain(
31 [
32 ("mean", vd.BlockMean(spacing=spacing * 111e3, uncertainty=True)),
33 ("spline", vd.Spline(damping=1e-10)),
34 ]
35 )
36 print(chain)
37
38 # Split the data into a training and testing set. We'll use the training set to grid the
39 # data and the testing set to validate our spline model. Weights need to
40 # 1/uncertainty**2 for the error propagation in BlockMean to work.
41 train, test = vd.train_test_split(
42 projection(*coordinates),
43 data.velocity_up,
44 weights=1 / data.std_up ** 2,
45 random_state=0,
46 )
47 # Fit the model on the training set
48 chain.fit(*train)
49 # And calculate an R^2 score coefficient on the testing set. The best possible score
50 # (perfect prediction) is 1. This can tell us how good our spline is at predicting data
51 # that was not in the input dataset.
52 score = chain.score(*test)
53 print("\nScore: {:.3f}".format(score))
54
55 # Create a grid of the vertical velocity and mask it to only show points close to the
56 # actual data.
57 region = vd.get_region(coordinates)
58 grid_full = chain.grid(
59 region=region,
60 spacing=spacing,
61 projection=projection,
62 dims=["latitude", "longitude"],
63 data_names=["velocity"],
64 )
65 grid = vd.distance_mask(
66 (data.longitude, data.latitude),
67 maxdist=5 * spacing * 111e3,
68 grid=grid_full,
69 projection=projection,
70 )
71
72 fig, axes = plt.subplots(
73 1, 2, figsize=(9, 7), subplot_kw=dict(projection=ccrs.Mercator())
74 )
75 crs = ccrs.PlateCarree()
76 # Plot the data uncertainties
77 ax = axes[0]
78 ax.set_title("Data uncertainty")
79 # Plot the uncertainties in mm/yr and using a power law for the color scale to highlight
80 # the smaller values
81 pc = ax.scatter(
82 *coordinates,
83 c=data.std_up * 1000,
84 s=20,
85 cmap="magma",
86 transform=crs,
87 norm=PowerNorm(gamma=1 / 2)
88 )
89 cb = plt.colorbar(pc, ax=ax, orientation="horizontal", pad=0.05)
90 cb.set_label("uncertainty [mm/yr]")
91 vd.datasets.setup_california_gps_map(ax, region=region)
92 # Plot the gridded velocities
93 ax = axes[1]
94 ax.set_title("Weighted spline interpolated velocity")
95 maxabs = vd.maxabs(data.velocity_up) * 1000
96 pc = (grid.velocity * 1000).plot.pcolormesh(
97 ax=ax,
98 cmap="seismic",
99 vmin=-maxabs,
100 vmax=maxabs,
101 transform=crs,
102 add_colorbar=False,
103 add_labels=False,
104 )
105 cb = plt.colorbar(pc, ax=ax, orientation="horizontal", pad=0.05)
106 cb.set_label("vertical velocity [mm/yr]")
107 ax.scatter(*coordinates, c="black", s=0.5, alpha=0.1, transform=crs)
108 vd.datasets.setup_california_gps_map(ax, region=region)
109 ax.coastlines()
110 plt.tight_layout()
111 plt.show()
112
[end of examples/spline_weights.py]
[start of examples/vector_uncoupled.py]
1 """
2 Gridding 2D vectors
3 ===================
4
5 We can use :class:`verde.Vector` to simultaneously process and grid all
6 components of vector data. Each component is processed and gridded separately
7 (see `Erizo <https://github.com/fatiando/erizo>`__ for an elastically coupled
8 alternative) but we have the convenience of dealing with a single estimator.
9 :class:`verde.Vector` can be combined with :class:`verde.Trend`,
10 :class:`verde.Spline`, and :class:`verde.Chain` to create a full processing
11 pipeline.
12 """
13 import matplotlib.pyplot as plt
14 import cartopy.crs as ccrs
15 import numpy as np
16 import pyproj
17 import verde as vd
18
19
20 # Fetch the wind speed data from Texas.
21 data = vd.datasets.fetch_texas_wind()
22 print(data.head())
23
24 # Separate out some of the data into utility variables
25 coordinates = (data.longitude.values, data.latitude.values)
26 region = vd.get_region(coordinates)
27 # Use a Mercator projection because Spline is a Cartesian gridder
28 projection = pyproj.Proj(proj="merc", lat_ts=data.latitude.mean())
29
30 # Split the data into a training and testing set. We'll fit the gridder on the training
31 # set and use the testing set to evaluate how well the gridder is performing.
32 train, test = vd.train_test_split(
33 projection(*coordinates),
34 (data.wind_speed_east_knots, data.wind_speed_north_knots),
35 random_state=2,
36 )
37
38 # We'll make a 20 arc-minute grid
39 spacing = 20 / 60
40
41 # Chain together a blocked mean to avoid aliasing, a polynomial trend (Spline usually
42 # requires de-trended data), and finally a Spline for each component. Notice that
43 # BlockReduce can work on multicomponent data without the use of Vector.
44 chain = vd.Chain(
45 [
46 ("mean", vd.BlockReduce(np.mean, spacing * 111e3)),
47 ("trend", vd.Vector([vd.Trend(degree=1) for i in range(2)])),
48 (
49 "spline",
50 vd.Vector([vd.Spline(damping=1e-10, mindist=500e3) for i in range(2)]),
51 ),
52 ]
53 )
54 print(chain)
55
56 # Fit on the training data
57 chain.fit(*train)
58 # And score on the testing data. The best possible score is 1, meaning a perfect
59 # prediction of the test data.
60 score = chain.score(*test)
61 print("Cross-validation R^2 score: {:.2f}".format(score))
62
63 # Interpolate the wind speed onto a regular geographic grid and mask the data that are
64 # far from the observation points
65 grid_full = chain.grid(
66 region, spacing=spacing, projection=projection, dims=["latitude", "longitude"]
67 )
68 grid = vd.distance_mask(
69 coordinates, maxdist=3 * spacing * 111e3, grid=grid_full, projection=projection
70 )
71
72 # Make maps of the original and gridded wind speed
73 plt.figure(figsize=(6, 6))
74 ax = plt.axes(projection=ccrs.Mercator())
75 ax.set_title("Uncoupled spline gridding of wind speed")
76 tmp = ax.quiver(
77 grid.longitude.values,
78 grid.latitude.values,
79 grid.east_component.values,
80 grid.north_component.values,
81 width=0.0015,
82 scale=100,
83 color="tab:blue",
84 transform=ccrs.PlateCarree(),
85 label="Interpolated",
86 )
87 ax.quiver(
88 *coordinates,
89 data.wind_speed_east_knots.values,
90 data.wind_speed_north_knots.values,
91 width=0.003,
92 scale=100,
93 color="tab:red",
94 transform=ccrs.PlateCarree(),
95 label="Original",
96 )
97 ax.quiverkey(tmp, 0.17, 0.23, 5, label="5 knots", coordinates="figure")
98 ax.legend(loc="lower left")
99 # Use an utility function to add tick labels and land and ocean features to the map.
100 vd.datasets.setup_texas_wind_map(ax)
101 plt.tight_layout()
102 plt.show()
103
[end of examples/vector_uncoupled.py]
[start of tutorials/weights.py]
1 """
2 Using Weights
3 =============
4
5 One of the advantages of using a Green's functions approach to interpolation is that we
6 can easily weight the data to give each point more or less influence over the results.
7 This is a good way to not let data points with large uncertainties bias the
8 interpolation or the data decimation.
9 """
10 # The weights vary a lot so it's better to plot them using a logarithmic color scale
11 from matplotlib.colors import LogNorm
12 import matplotlib.pyplot as plt
13 import cartopy.crs as ccrs
14 import numpy as np
15 import verde as vd
16
17 ########################################################################################
18 # We'll use some sample GPS vertical ground velocity which has some variable
19 # uncertainties associated with each data point. The data are loaded as a
20 # pandas.DataFrame:
21 data = vd.datasets.fetch_california_gps()
22 print(data.head())
23
24 ########################################################################################
25 # Let's plot our data using Cartopy to see what the vertical velocities and their
26 # uncertainties look like. We'll make a function for this so we can reuse it later on.
27
28
29 def plot_data(coordinates, velocity, weights, title_data, title_weights):
30 "Make two maps of our data, one with the data and one with the weights/uncertainty"
31 fig, axes = plt.subplots(
32 1, 2, figsize=(9.5, 7), subplot_kw=dict(projection=ccrs.Mercator())
33 )
34 crs = ccrs.PlateCarree()
35 ax = axes[0]
36 ax.set_title(title_data)
37 maxabs = vd.maxabs(velocity)
38 pc = ax.scatter(
39 *coordinates,
40 c=velocity,
41 s=30,
42 cmap="seismic",
43 vmin=-maxabs,
44 vmax=maxabs,
45 transform=crs,
46 )
47 plt.colorbar(pc, ax=ax, orientation="horizontal", pad=0.05).set_label("m/yr")
48 vd.datasets.setup_california_gps_map(ax)
49 ax = axes[1]
50 ax.set_title(title_weights)
51 pc = ax.scatter(
52 *coordinates, c=weights, s=30, cmap="magma", transform=crs, norm=LogNorm()
53 )
54 plt.colorbar(pc, ax=ax, orientation="horizontal", pad=0.05)
55 vd.datasets.setup_california_gps_map(ax)
56 plt.tight_layout()
57 plt.show()
58
59
60 # Plot the data and the uncertainties
61 plot_data(
62 (data.longitude, data.latitude),
63 data.velocity_up,
64 data.std_up,
65 "Vertical GPS velocity",
66 "Uncertainty (m/yr)",
67 )
68
69 ########################################################################################
70 # Weights in data decimation
71 # --------------------------
72 #
73 # :class:`~verde.BlockReduce` can't output weights for each data point because it
74 # doesn't know which reduction operation it's using. If you want to do a weighted
75 # interpolation, like :class:`verde.Spline`, :class:`~verde.BlockReduce` won't propagate
76 # the weights to the interpolation function. If your data are relatively smooth, you can
77 # use :class:`verde.BlockMean` instead to decimated data and produce weights. It can
78 # calculate different kinds of weights, depending on configuration options and what you
79 # give it as input.
80 #
81 # Let's explore all of the possibilities.
82 mean = vd.BlockMean(spacing=15 / 60)
83 print(mean)
84
85 ########################################################################################
86 # Option 1: No input weights
87 # ++++++++++++++++++++++++++
88 #
89 # In this case, we'll get a standard mean and the output weights will be 1 over the
90 # variance of the data in each block:
91 #
92 # .. math::
93 #
94 # \bar{d} = \dfrac{\sum\limits_{i=1}^N d_i}{N}
95 # \: , \qquad
96 # \sigma^2 = \dfrac{\sum\limits_{i=1}^N (d_i - \bar{d})^2}{N}
97 # \: , \qquad
98 # w = \dfrac{1}{\sigma^2}
99 #
100 # in which :math:`N` is the number of data points in the block, :math:`d_i` are the
101 # data values in the block, and the output values for the block are the mean data
102 # :math:`\bar{d}` and the weight :math:`w`.
103 #
104 # Notice that data points that are more uncertain don't necessarily have smaller
105 # weights. Instead, the blocks that contain data with sharper variations end up having
106 # smaller weights, like the data points in the south.
107 coordinates, velocity, weights = mean.filter(
108 coordinates=(data.longitude, data.latitude), data=data.velocity_up
109 )
110
111 plot_data(
112 coordinates,
113 velocity,
114 weights,
115 "Mean vertical GPS velocity",
116 "Weights based on data variance",
117 )
118
119 ########################################################################################
120 # Option 2: Input weights are not related to the uncertainty of the data
121 # ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
122 #
123 # This is the case when data weights are chosen by the user, not based on the
124 # measurement uncertainty. For example, when you need to give less importance to a
125 # portion of the data and no uncertainties are available. The mean will be weighted and
126 # the output weights will be 1 over the weighted variance of the data in each block:
127 #
128 # .. math::
129 #
130 # \bar{d}^* = \dfrac{\sum\limits_{i=1}^N w_i d_i}{\sum\limits_{i=1}^N w_i}
131 # \: , \qquad
132 # \sigma^2_w = \dfrac{\sum\limits_{i=1}^N w_i(d_i - \bar{d}*)^2}{
133 # \sum\limits_{i=1}^N w_i}
134 # \: , \qquad
135 # w = \dfrac{1}{\sigma^2_w}
136 #
137 # in which :math:`w_i` are the input weights in the block.
138 #
139 # The output will be similar to the one above but points with larger initial weights
140 # will have a smaller influence on the mean and also on the output weights.
141
142 # We'll use 1 over the squared data uncertainty as our input weights.
143 data["weights"] = 1 / data.std_up ** 2
144
145 # By default, BlockMean assumes that weights are not related to uncertainties
146 coordinates, velocity, weights = mean.filter(
147 coordinates=(data.longitude, data.latitude),
148 data=data.velocity_up,
149 weights=data.weights,
150 )
151
152 plot_data(
153 coordinates,
154 velocity,
155 weights,
156 "Weighted mean vertical GPS velocity",
157 "Weights based on weighted data variance",
158 )
159
160 ########################################################################################
161 # Option 3: Input weights are 1 over the data uncertainty squared
162 # +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
163 #
164 # If input weights are 1 over the data uncertainty squared, we can use uncertainty
165 # propagation to calculate the uncertainty of the weighted mean and use it to define our
166 # output weights. Use option ``uncertainty=True`` to tell :class:`~verde.BlockMean` to
167 # calculate weights based on the propagated uncertainty of the data. The output weights
168 # will be 1 over the propagated uncertainty squared. In this case, the **input weights
169 # must not be normalized**. This is preferable if you know the uncertainty of the data.
170 #
171 # .. math::
172 #
173 # w_i = \dfrac{1}{\sigma_i^2}
174 # \: , \qquad
175 # \sigma_{\bar{d}^*}^2 = \dfrac{1}{\sum\limits_{i=1}^N w_i}
176 # \: , \qquad
177 # w = \dfrac{1}{\sigma_{\bar{d}^*}^2}
178 #
179 # in which :math:`\sigma_i` are the input data uncertainties in the block and
180 # :math:`\sigma_{\bar{d}^*}` is the propagated uncertainty of the weighted mean in the
181 # block.
182 #
183 # Notice that in this case the output weights reflect the input data uncertainties. Less
184 # weight is given to the data points that had larger uncertainties from the start.
185
186 # Configure BlockMean to assume that the input weights are 1/uncertainty**2
187 mean = vd.BlockMean(spacing=15 / 60, uncertainty=True)
188
189 coordinates, velocity, weights = mean.filter(
190 coordinates=(data.longitude, data.latitude),
191 data=data.velocity_up,
192 weights=data.weights,
193 )
194
195 plot_data(
196 coordinates,
197 velocity,
198 weights,
199 "Weighted mean vertical GPS velocity",
200 "Weights based on data uncertainty",
201 )
202
203 ########################################################################################
204 #
205 # .. note::
206 #
207 # Output weights are always normalized to the ]0, 1] range. See
208 # :func:`verde.variance_to_weights`.
209 #
210 # Interpolation with weights
211 # --------------------------
212 #
213 # The Green's functions based interpolation classes in Verde, like
214 # :class:`~verde.Spline`, can take input weights if you want to give less importance to
215 # some data points. In our case, the points with larger uncertainties shouldn't have the
216 # same influence in our gridded solution as the points with lower uncertainties.
217 #
218 # Let's setup a projection to grid our geographic data using the Cartesian spline
219 # gridder.
220 import pyproj
221
222 projection = pyproj.Proj(proj="merc", lat_ts=data.latitude.mean())
223 proj_coords = projection(data.longitude.values, data.latitude.values)
224
225 region = vd.get_region(coordinates)
226 spacing = 5 / 60
227
228 ########################################################################################
229 # Now we can grid our data using a weighted spline. We'll use the block mean results
230 # with uncertainty based weights.
231 #
232 # Note that the weighted spline solution will only work on a non-exact interpolation. So
233 # we'll need to use some damping regularization or not use the data locations for the
234 # point forces. Here, we'll apply a bit of damping.
235 spline = vd.Chain(
236 [
237 # Convert the spacing to meters because Spline is a Cartesian gridder
238 ("mean", vd.BlockMean(spacing=spacing * 111e3, uncertainty=True)),
239 ("spline", vd.Spline(damping=1e-10)),
240 ]
241 ).fit(proj_coords, data.velocity_up, data.weights)
242 grid = spline.grid(
243 region=region,
244 spacing=spacing,
245 projection=projection,
246 dims=["latitude", "longitude"],
247 data_names=["velocity"],
248 )
249
250 ########################################################################################
251 # Calculate an unweighted spline as well for comparison.
252 spline_unweighted = vd.Chain(
253 [
254 ("mean", vd.BlockReduce(np.mean, spacing=spacing * 111e3)),
255 ("spline", vd.Spline()),
256 ]
257 ).fit(proj_coords, data.velocity_up)
258 grid_unweighted = spline_unweighted.grid(
259 region=region,
260 spacing=spacing,
261 projection=projection,
262 dims=["latitude", "longitude"],
263 data_names=["velocity"],
264 )
265
266 ########################################################################################
267 # Finally, plot the weighted and unweighted grids side by side.
268 fig, axes = plt.subplots(
269 1, 2, figsize=(9.5, 7), subplot_kw=dict(projection=ccrs.Mercator())
270 )
271 crs = ccrs.PlateCarree()
272 ax = axes[0]
273 ax.set_title("Spline interpolation with weights")
274 maxabs = vd.maxabs(data.velocity_up)
275 pc = grid.velocity.plot.pcolormesh(
276 ax=ax,
277 cmap="seismic",
278 vmin=-maxabs,
279 vmax=maxabs,
280 transform=crs,
281 add_colorbar=False,
282 add_labels=False,
283 )
284 plt.colorbar(pc, ax=ax, orientation="horizontal", pad=0.05).set_label("m/yr")
285 ax.plot(data.longitude, data.latitude, ".k", markersize=0.1, transform=crs)
286 ax.coastlines()
287 vd.datasets.setup_california_gps_map(ax)
288 ax = axes[1]
289 ax.set_title("Spline interpolation without weights")
290 pc = grid_unweighted.velocity.plot.pcolormesh(
291 ax=ax,
292 cmap="seismic",
293 vmin=-maxabs,
294 vmax=maxabs,
295 transform=crs,
296 add_colorbar=False,
297 add_labels=False,
298 )
299 plt.colorbar(pc, ax=ax, orientation="horizontal", pad=0.05).set_label("m/yr")
300 ax.plot(data.longitude, data.latitude, ".k", markersize=0.1, transform=crs)
301 ax.coastlines()
302 vd.datasets.setup_california_gps_map(ax)
303 plt.tight_layout()
304 plt.show()
305
[end of tutorials/weights.py]
[start of verde/__init__.py]
1 # pylint: disable=missing-docstring,import-outside-toplevel
2 # Import functions/classes to make the public API
3 from . import datasets
4 from . import version
5 from .coordinates import (
6 scatter_points,
7 grid_coordinates,
8 inside,
9 block_split,
10 rolling_window,
11 expanding_window,
12 profile_coordinates,
13 get_region,
14 pad_region,
15 project_region,
16 longitude_continuity,
17 )
18 from .mask import distance_mask
19 from .utils import variance_to_weights, maxabs, grid_to_table
20 from .io import load_surfer
21 from .distances import median_distance
22 from .blockreduce import BlockReduce, BlockMean
23 from .scipygridder import ScipyGridder
24 from .trend import Trend
25 from .chain import Chain
26 from .spline import Spline, SplineCV
27 from .model_selection import cross_val_score, train_test_split
28 from .vector import Vector, VectorSpline2D
29
30
31 def test(doctest=True, verbose=True, coverage=False, figures=True):
32 """
33 Run the test suite.
34
35 Uses `py.test <http://pytest.org/>`__ to discover and run the tests.
36
37 Parameters
38 ----------
39
40 doctest : bool
41 If ``True``, will run the doctests as well (code examples that start
42 with a ``>>>`` in the docs).
43 verbose : bool
44 If ``True``, will print extra information during the test run.
45 coverage : bool
46 If ``True``, will run test coverage analysis on the code as well.
47 Requires ``pytest-cov``.
48 figures : bool
49 If ``True``, will test generated figures against saved baseline
50 figures. Requires ``pytest-mpl`` and ``matplotlib``.
51
52 Raises
53 ------
54
55 AssertionError
56 If pytest returns a non-zero error code indicating that some tests have
57 failed.
58
59 """
60 import pytest
61
62 package = __name__
63 args = []
64 if verbose:
65 args.append("-vv")
66 if coverage:
67 args.append("--cov={}".format(package))
68 args.append("--cov-report=term-missing")
69 if doctest:
70 args.append("--doctest-modules")
71 if figures:
72 args.append("--mpl")
73 args.append("--pyargs")
74 args.append(package)
75 status = pytest.main(args)
76 assert status == 0, "Some tests have failed."
77
[end of verde/__init__.py]
[start of verde/mask.py]
1 """
2 Mask grid points based on different criteria.
3 """
4 import numpy as np
5
6 from .base import n_1d_arrays
7 from .utils import kdtree
8
9
10 def distance_mask(
11 data_coordinates, maxdist, coordinates=None, grid=None, projection=None
12 ):
13 """
14 Mask grid points that are too far from the given data points.
15
16 Distances are Euclidean norms. If using geographic data, provide a
17 projection function to convert coordinates to Cartesian before distance
18 calculations.
19
20 Either *coordinates* or *grid* must be given:
21
22 * If *coordinates* is not None, produces an array that is False when a
23 point is more than *maxdist* from the closest data point and True
24 otherwise.
25 * If *grid* is not None, produces a mask and applies it to *grid* (an
26 :class:`xarray.Dataset`).
27
28 .. note::
29
30 If installed, package ``pykdtree`` will be used instead of
31 :class:`scipy.spatial.cKDTree` for better performance.
32
33
34 Parameters
35 ----------
36 data_coordinates : tuple of arrays
37 Same as *coordinates* but for the data points.
38 maxdist : float
39 The maximum distance that a point can be from the closest data point.
40 coordinates : None or tuple of arrays
41 Arrays with the coordinates of each point that will be masked. Should
42 be in the following order: (easting, northing, ...). Only easting and
43 northing will be used, all subsequent coordinates will be ignored.
44 grid : None or :class:`xarray.Dataset`
45 2D grid with values to be masked. Will use the first two dimensions of
46 the grid as northing and easting coordinates, respectively. The mask
47 will be applied to *grid* using the :meth:`xarray.Dataset.where`
48 method.
49 projection : callable or None
50 If not None, then should be a callable object ``projection(easting,
51 northing) -> (proj_easting, proj_northing)`` that takes in easting and
52 northing coordinate arrays and returns projected easting and northing
53 coordinate arrays. This function will be used to project the given
54 coordinates (or the ones extracted from the grid) before calculating
55 distances.
56
57 Returns
58 -------
59 mask : array or :class:`xarray.Dataset`
60 If *coordinates* was given, then a boolean array with the same shape as
61 the elements of *coordinates*. If *grid* was given, then an
62 :class:`xarray.Dataset` with the mask applied to it.
63
64 Examples
65 --------
66
67 >>> from verde import grid_coordinates
68 >>> region = (0, 5, -10, -4)
69 >>> spacing = 1
70 >>> coords = grid_coordinates(region, spacing=spacing)
71 >>> mask = distance_mask((2.5, -7.5), maxdist=2, coordinates=coords)
72 >>> print(mask)
73 [[False False False False False False]
74 [False False True True False False]
75 [False True True True True False]
76 [False True True True True False]
77 [False False True True False False]
78 [False False False False False False]
79 [False False False False False False]]
80 >>> # Mask an xarray.Dataset directly
81 >>> import xarray as xr
82 >>> coords_dict = {"easting": coords[0][0, :], "northing": coords[1][:, 0]}
83 >>> data_vars = {"scalars": (["northing", "easting"], np.ones(mask.shape))}
84 >>> grid = xr.Dataset(data_vars, coords=coords_dict)
85 >>> masked = distance_mask((3.5, -7.5), maxdist=2, grid=grid)
86 >>> print(masked.scalars.values)
87 [[nan nan nan nan nan nan]
88 [nan nan nan 1. 1. nan]
89 [nan nan 1. 1. 1. 1.]
90 [nan nan 1. 1. 1. 1.]
91 [nan nan nan 1. 1. nan]
92 [nan nan nan nan nan nan]
93 [nan nan nan nan nan nan]]
94
95 """
96 if coordinates is None and grid is None:
97 raise ValueError("Either coordinates or grid must be given.")
98 if coordinates is None:
99 dims = [grid[var].dims for var in grid.data_vars][0]
100 coordinates = np.meshgrid(grid.coords[dims[1]], grid.coords[dims[0]])
101 if len(set(i.shape for i in coordinates)) != 1:
102 raise ValueError("Coordinate arrays must have the same shape.")
103 shape = coordinates[0].shape
104 if projection is not None:
105 data_coordinates = projection(*n_1d_arrays(data_coordinates, 2))
106 coordinates = projection(*n_1d_arrays(coordinates, 2))
107 tree = kdtree(data_coordinates[:2])
108 distance = tree.query(np.transpose(n_1d_arrays(coordinates, 2)))[0].reshape(shape)
109 mask = distance <= maxdist
110 if grid is not None:
111 return grid.where(mask)
112 return mask
113
[end of verde/mask.py]
</code>
I need you to solve this issue by generating a single patch file that I can apply directly to this repository using git apply. Please respond with a single patch file in the following format.
<patch>
--- a/file.py
+++ b/file.py
@@ -1,27 +1,35 @@
def euclidean(a, b):
- while b:
- a, b = b, a % b
- return a
+ if b == 0:
+ return a
+ return euclidean(b, a % b)
def bresenham(x0, y0, x1, y1):
points = []
dx = abs(x1 - x0)
dy = abs(y1 - y0)
- sx = 1 if x0 < x1 else -1
- sy = 1 if y0 < y1 else -1
- err = dx - dy
+ x, y = x0, y0
+ sx = -1 if x0 > x1 else 1
+ sy = -1 if y0 > y1 else 1
- while True:
- points.append((x0, y0))
- if x0 == x1 and y0 == y1:
- break
- e2 = 2 * err
- if e2 > -dy:
+ if dx > dy:
+ err = dx / 2.0
+ while x != x1:
+ points.append((x, y))
err -= dy
- x0 += sx
- if e2 < dx:
- err += dx
- y0 += sy
+ if err < 0:
+ y += sy
+ err += dx
+ x += sx
+ else:
+ err = dy / 2.0
+ while y != y1:
+ points.append((x, y))
+ err -= dx
+ if err < 0:
+ x += sx
+ err += dy
+ y += sy
+ points.append((x, y))
return points
</patch>
| fatiando/verde | 072e9df774fb63b0362bd6c95b3e45b7a1dc240c | Create a convexhull masking function
**Description of the desired feature**
A good way to mask grid points that are too far from data is to only show the ones that fall inside the data convex hull. This is what many scipy and matplotlib interpolations do. It would be great to have a `convexhull_mask` function that has a similar interface to `distance_mask` but masks points outside of the convex hull. This function should take the same arguments as `distance_mask` except `maxdist`.
One way of implementing this would be with the `scipy.spatial.Delaunay` class:
```
tri = Delaunay(np.transpose(data_coordinates))
# Find which triangle each grid point is in. -1 indicates that it's not in any.
in_triangle = tri.find_simplex(np.transpose(coordinates))
mask = in_triangle > 0
``` | 2020-03-11 16:45:40+00:00 | <patch>
diff --git a/doc/api/index.rst b/doc/api/index.rst
index 71a1b9c..bcc49d8 100644
--- a/doc/api/index.rst
+++ b/doc/api/index.rst
@@ -63,6 +63,15 @@ Coordinate Manipulation
rolling_window
expanding_window
+Masking
+-------
+
+.. autosummary::
+ :toctree: generated/
+
+ distance_mask
+ convexhull_mask
+
Utilities
---------
@@ -71,7 +80,6 @@ Utilities
test
maxabs
- distance_mask
variance_to_weights
grid_to_table
median_distance
diff --git a/examples/convex_hull_mask.py b/examples/convex_hull_mask.py
new file mode 100644
index 0000000..438e558
--- /dev/null
+++ b/examples/convex_hull_mask.py
@@ -0,0 +1,51 @@
+"""
+Mask grid points by convex hull
+===============================
+
+Sometimes, data points are unevenly distributed. In such cases, we might not
+want to have interpolated grid points that are too far from any data point.
+Function :func:`verde.convexhull_mask` allows us to set grid points that fall
+outside of the convex hull of the data points to NaN or some other value.
+"""
+import matplotlib.pyplot as plt
+import cartopy.crs as ccrs
+import pyproj
+import numpy as np
+import verde as vd
+
+# The Baja California bathymetry dataset has big gaps on land. We want to mask
+# these gaps on a dummy grid that we'll generate over the region just to show
+# what that looks like.
+data = vd.datasets.fetch_baja_bathymetry()
+region = vd.get_region((data.longitude, data.latitude))
+
+# Generate the coordinates for a regular grid mask
+spacing = 10 / 60
+coordinates = vd.grid_coordinates(region, spacing=spacing)
+
+# Generate a mask for points. The mask is True for points that are within the
+# convex hull. We can provide a projection function to convert the coordinates
+# before the convex hull is calculated (Mercator in this case).
+mask = vd.convexhull_mask(
+ data_coordinates=(data.longitude, data.latitude),
+ coordinates=coordinates,
+ projection=pyproj.Proj(proj="merc", lat_ts=data.latitude.mean()),
+)
+print(mask)
+
+# Create a dummy grid with ones that we can mask to show the results. Turn
+# points that are outside of the convex hull into NaNs so they won't show up in
+# our plot.
+dummy_data = np.ones_like(coordinates[0])
+dummy_data[~mask] = np.nan
+
+# Make a plot of the masked data and the data locations.
+crs = ccrs.PlateCarree()
+plt.figure(figsize=(7, 6))
+ax = plt.axes(projection=ccrs.Mercator())
+ax.set_title("Only keep grid points that inside of the convex hull")
+ax.plot(data.longitude, data.latitude, ".y", markersize=0.5, transform=crs)
+ax.pcolormesh(*coordinates, dummy_data, transform=crs)
+vd.datasets.setup_baja_bathymetry_map(ax, land=None)
+plt.tight_layout()
+plt.show()
diff --git a/examples/spline_weights.py b/examples/spline_weights.py
index 73de4fd..171fa83 100644
--- a/examples/spline_weights.py
+++ b/examples/spline_weights.py
@@ -62,11 +62,8 @@ grid_full = chain.grid(
dims=["latitude", "longitude"],
data_names=["velocity"],
)
-grid = vd.distance_mask(
- (data.longitude, data.latitude),
- maxdist=5 * spacing * 111e3,
- grid=grid_full,
- projection=projection,
+grid = vd.convexhull_mask(
+ (data.longitude, data.latitude), grid=grid_full, projection=projection
)
fig, axes = plt.subplots(
diff --git a/examples/vector_uncoupled.py b/examples/vector_uncoupled.py
index acc6db1..6f64194 100644
--- a/examples/vector_uncoupled.py
+++ b/examples/vector_uncoupled.py
@@ -61,13 +61,11 @@ score = chain.score(*test)
print("Cross-validation R^2 score: {:.2f}".format(score))
# Interpolate the wind speed onto a regular geographic grid and mask the data that are
-# far from the observation points
+# outside of the convex hull of the data points.
grid_full = chain.grid(
region, spacing=spacing, projection=projection, dims=["latitude", "longitude"]
)
-grid = vd.distance_mask(
- coordinates, maxdist=3 * spacing * 111e3, grid=grid_full, projection=projection
-)
+grid = vd.convexhull_mask(coordinates, grid=grid_full, projection=projection)
# Make maps of the original and gridded wind speed
plt.figure(figsize=(6, 6))
diff --git a/tutorials/weights.py b/tutorials/weights.py
index 099f1f2..74f5c5c 100644
--- a/tutorials/weights.py
+++ b/tutorials/weights.py
@@ -246,6 +246,8 @@ grid = spline.grid(
dims=["latitude", "longitude"],
data_names=["velocity"],
)
+# Avoid showing interpolation outside of the convex hull of the data points.
+grid = vd.convexhull_mask(coordinates, grid=grid, projection=projection)
########################################################################################
# Calculate an unweighted spline as well for comparison.
@@ -262,6 +264,9 @@ grid_unweighted = spline_unweighted.grid(
dims=["latitude", "longitude"],
data_names=["velocity"],
)
+grid_unweighted = vd.convexhull_mask(
+ coordinates, grid=grid_unweighted, projection=projection
+)
########################################################################################
# Finally, plot the weighted and unweighted grids side by side.
diff --git a/verde/__init__.py b/verde/__init__.py
index b7df9b7..a77e65c 100644
--- a/verde/__init__.py
+++ b/verde/__init__.py
@@ -15,7 +15,7 @@ from .coordinates import (
project_region,
longitude_continuity,
)
-from .mask import distance_mask
+from .mask import distance_mask, convexhull_mask
from .utils import variance_to_weights, maxabs, grid_to_table
from .io import load_surfer
from .distances import median_distance
diff --git a/verde/mask.py b/verde/mask.py
index 9c06ca2..6d92cb4 100644
--- a/verde/mask.py
+++ b/verde/mask.py
@@ -3,7 +3,10 @@ Mask grid points based on different criteria.
"""
import numpy as np
-from .base import n_1d_arrays
+# pylint doesn't pick up on this import for some reason
+from scipy.spatial import Delaunay # pylint: disable=no-name-in-module
+
+from .base.utils import n_1d_arrays, check_coordinates
from .utils import kdtree
@@ -43,9 +46,10 @@ def distance_mask(
northing will be used, all subsequent coordinates will be ignored.
grid : None or :class:`xarray.Dataset`
2D grid with values to be masked. Will use the first two dimensions of
- the grid as northing and easting coordinates, respectively. The mask
- will be applied to *grid* using the :meth:`xarray.Dataset.where`
- method.
+ the grid as northing and easting coordinates, respectively. For this to
+ work, the grid dimensions **must be ordered as northing then easting**.
+ The mask will be applied to *grid* using the
+ :meth:`xarray.Dataset.where` method.
projection : callable or None
If not None, then should be a callable object ``projection(easting,
northing) -> (proj_easting, proj_northing)`` that takes in easting and
@@ -93,14 +97,7 @@ def distance_mask(
[nan nan nan nan nan nan]]
"""
- if coordinates is None and grid is None:
- raise ValueError("Either coordinates or grid must be given.")
- if coordinates is None:
- dims = [grid[var].dims for var in grid.data_vars][0]
- coordinates = np.meshgrid(grid.coords[dims[1]], grid.coords[dims[0]])
- if len(set(i.shape for i in coordinates)) != 1:
- raise ValueError("Coordinate arrays must have the same shape.")
- shape = coordinates[0].shape
+ coordinates, shape = _get_grid_coordinates(coordinates, grid)
if projection is not None:
data_coordinates = projection(*n_1d_arrays(data_coordinates, 2))
coordinates = projection(*n_1d_arrays(coordinates, 2))
@@ -110,3 +107,121 @@ def distance_mask(
if grid is not None:
return grid.where(mask)
return mask
+
+
+def convexhull_mask(
+ data_coordinates, coordinates=None, grid=None, projection=None,
+):
+ """
+ Mask grid points that are outside the convex hull of the given data points.
+
+ Either *coordinates* or *grid* must be given:
+
+ * If *coordinates* is not None, produces an array that is False when a
+ point is outside the convex hull and True otherwise.
+ * If *grid* is not None, produces a mask and applies it to *grid* (an
+ :class:`xarray.Dataset`).
+
+ Parameters
+ ----------
+ data_coordinates : tuple of arrays
+ Same as *coordinates* but for the data points.
+ coordinates : None or tuple of arrays
+ Arrays with the coordinates of each point that will be masked. Should
+ be in the following order: (easting, northing, ...). Only easting and
+ northing will be used, all subsequent coordinates will be ignored.
+ grid : None or :class:`xarray.Dataset`
+ 2D grid with values to be masked. Will use the first two dimensions of
+ the grid as northing and easting coordinates, respectively. For this to
+ work, the grid dimensions **must be ordered as northing then easting**.
+ The mask will be applied to *grid* using the
+ :meth:`xarray.Dataset.where` method.
+ projection : callable or None
+ If not None, then should be a callable object ``projection(easting,
+ northing) -> (proj_easting, proj_northing)`` that takes in easting and
+ northing coordinate arrays and returns projected easting and northing
+ coordinate arrays. This function will be used to project the given
+ coordinates (or the ones extracted from the grid) before calculating
+ distances.
+
+ Returns
+ -------
+ mask : array or :class:`xarray.Dataset`
+ If *coordinates* was given, then a boolean array with the same shape as
+ the elements of *coordinates*. If *grid* was given, then an
+ :class:`xarray.Dataset` with the mask applied to it.
+
+ Examples
+ --------
+
+ >>> from verde import grid_coordinates
+ >>> region = (0, 5, -10, -4)
+ >>> spacing = 1
+ >>> coords = grid_coordinates(region, spacing=spacing)
+ >>> data_coords = ((2, 3, 2, 3), (-9, -9, -6, -6))
+ >>> mask = convexhull_mask(data_coords, coordinates=coords)
+ >>> print(mask)
+ [[False False False False False False]
+ [False False True True False False]
+ [False False True True False False]
+ [False False True True False False]
+ [False False True True False False]
+ [False False False False False False]
+ [False False False False False False]]
+ >>> # Mask an xarray.Dataset directly
+ >>> import xarray as xr
+ >>> coords_dict = {"easting": coords[0][0, :], "northing": coords[1][:, 0]}
+ >>> data_vars = {"scalars": (["northing", "easting"], np.ones(mask.shape))}
+ >>> grid = xr.Dataset(data_vars, coords=coords_dict)
+ >>> masked = convexhull_mask(data_coords, grid=grid)
+ >>> print(masked.scalars.values)
+ [[nan nan nan nan nan nan]
+ [nan nan 1. 1. nan nan]
+ [nan nan 1. 1. nan nan]
+ [nan nan 1. 1. nan nan]
+ [nan nan 1. 1. nan nan]
+ [nan nan nan nan nan nan]
+ [nan nan nan nan nan nan]]
+
+ """
+ coordinates, shape = _get_grid_coordinates(coordinates, grid)
+ n_coordinates = 2
+ # Make sure they are arrays so we can normalize
+ data_coordinates = n_1d_arrays(data_coordinates, n_coordinates)
+ coordinates = n_1d_arrays(coordinates, n_coordinates)
+ if projection is not None:
+ data_coordinates = projection(*data_coordinates)
+ coordinates = projection(*coordinates)
+ # Normalize the coordinates to avoid errors from qhull when values are very
+ # large (as occurs when projections are used).
+ means = [coord.mean() for coord in data_coordinates]
+ stds = [coord.std() for coord in data_coordinates]
+ data_coordinates = tuple(
+ (coord - mean) / std for coord, mean, std in zip(data_coordinates, means, stds)
+ )
+ coordinates = tuple(
+ (coord - mean) / std for coord, mean, std in zip(coordinates, means, stds)
+ )
+ triangles = Delaunay(np.transpose(data_coordinates))
+ # Find the triangle that contains each grid point.
+ # -1 indicates that it's not in any triangle.
+ in_triangle = triangles.find_simplex(np.transpose(coordinates))
+ mask = (in_triangle != -1).reshape(shape)
+ if grid is not None:
+ return grid.where(mask)
+ return mask
+
+
+def _get_grid_coordinates(coordinates, grid):
+ """
+ If coordinates is given, return it and their shape. Otherwise, get
+ coordinate arrays from the grid.
+ """
+ if coordinates is None and grid is None:
+ raise ValueError("Either coordinates or grid must be given.")
+ if coordinates is None:
+ dims = [grid[var].dims for var in grid.data_vars][0]
+ coordinates = np.meshgrid(grid.coords[dims[1]], grid.coords[dims[0]])
+ check_coordinates(coordinates)
+ shape = coordinates[0].shape
+ return coordinates, shape
</patch> | diff --git a/verde/tests/test_mask.py b/verde/tests/test_mask.py
index cfb80ed..5a7bd25 100644
--- a/verde/tests/test_mask.py
+++ b/verde/tests/test_mask.py
@@ -6,10 +6,50 @@ import numpy.testing as npt
import xarray as xr
import pytest
-from ..mask import distance_mask
+from ..mask import distance_mask, convexhull_mask
from ..coordinates import grid_coordinates
+def test_convexhull_mask():
+ "Check that the mask works for basic input"
+ region = (0, 5, -10, -4)
+ coords = grid_coordinates(region, spacing=1)
+ data_coords = ((2, 3, 2, 3), (-9, -9, -6, -6))
+ mask = convexhull_mask(data_coords, coordinates=coords)
+ true = [
+ [False, False, False, False, False, False],
+ [False, False, True, True, False, False],
+ [False, False, True, True, False, False],
+ [False, False, True, True, False, False],
+ [False, False, True, True, False, False],
+ [False, False, False, False, False, False],
+ [False, False, False, False, False, False],
+ ]
+ assert mask.tolist() == true
+
+
+def test_convexhull_mask_projection():
+ "Check that the mask works when given a projection"
+ region = (0, 5, -10, -4)
+ coords = grid_coordinates(region, spacing=1)
+ data_coords = ((2, 3, 2, 3), (-9, -9, -6, -6))
+ # For a linear projection, the result should be the same since there is no
+ # area change in the data.
+ mask = convexhull_mask(
+ data_coords, coordinates=coords, projection=lambda e, n: (10 * e, 10 * n),
+ )
+ true = [
+ [False, False, False, False, False, False],
+ [False, False, True, True, False, False],
+ [False, False, True, True, False, False],
+ [False, False, True, True, False, False],
+ [False, False, True, True, False, False],
+ [False, False, False, False, False, False],
+ [False, False, False, False, False, False],
+ ]
+ assert mask.tolist() == true
+
+
def test_distance_mask():
"Check that the mask works for basic input"
region = (0, 5, -10, -4)
| 0.0 | [
"verde/tests/test_mask.py::test_convexhull_mask",
"verde/tests/test_mask.py::test_convexhull_mask_projection",
"verde/tests/test_mask.py::test_distance_mask",
"verde/tests/test_mask.py::test_distance_mask_projection",
"verde/tests/test_mask.py::test_distance_mask_grid",
"verde/tests/test_mask.py::test_distance_mask_missing_args",
"verde/tests/test_mask.py::test_distance_mask_wrong_shapes"
] | [] | 072e9df774fb63b0362bd6c95b3e45b7a1dc240c |
|
pydantic__pydantic-4012 | You will be provided with a partial code base and an issue statement explaining a problem to resolve.
<issue>
Model signatures contain reserved words.
### Checks
* [x] I added a descriptive title to this issue
* [x] I have searched (google, github) for similar issues and couldn't find anything
* [x] I have read and followed [the docs](https://pydantic-docs.helpmanual.io/) and still think this is a bug
# Bug
When creating a model with a field that has an alias that is a python reserved word, the signature uses the reserved word even when `allow_population_by_field_name` is true.
I expect the signature to use the field name in this case. I came across this issue when generating tests using hypothesis. This uses the model signature and was creating invalid test code as it was using the reserved words in the signature.
```
pydantic version: 1.9.0
pydantic compiled: False
python version: 3.9.0 (tags/v3.9.0:9cf6752, Oct 5 2020, 15:34:40) [MSC v.1927 64 bit (AMD64)]
platform: Windows-10-10.0.19041-SP0
optional deps. installed: ['devtools', 'dotenv', 'email-validator', 'typing-extensions']
```
```py
from pydantic import BaseModel, Field
from inspect import signature
class Foo(BaseModel):
from_: str = Field(..., alias='from')
class Config:
allow_population_by_field_name = True
str(signature(Foo))
```
> '(*, from: str) -> None'
# Suggestion
In `utils.generate_model_signature()` we have these two checks.
```py
elif not param_name.isidentifier():
if allow_names and field_name.isidentifier():
```
I propose replacing these with calls to the following (new) function.
```py
def is_valid_identifier(identifier: str) -> bool:
"""
Checks that a string is a valid identifier and not a reserved word.
:param identifier: The identifier to test.
:return: True if the identifier is valid.
"""
return identifier.isidentifier() and not keyword.iskeyword(identifier)
```
I believe that this behaviour is closer to what a user would expect in the common case that they are generating models from a schema containing python reserved words.
</issue>
<code>
[start of README.md]
1 # pydantic
2
3 [](https://github.com/samuelcolvin/pydantic/actions?query=event%3Apush+branch%3Amaster+workflow%3ACI)
4 [](https://coverage-badge.samuelcolvin.workers.dev/redirect/samuelcolvin/pydantic)
5 [](https://pypi.python.org/pypi/pydantic)
6 [](https://anaconda.org/conda-forge/pydantic)
7 [](https://pepy.tech/project/pydantic)
8 [](https://github.com/samuelcolvin/pydantic)
9 [](https://github.com/samuelcolvin/pydantic/blob/master/LICENSE)
10
11 Data validation and settings management using Python type hints.
12
13 Fast and extensible, *pydantic* plays nicely with your linters/IDE/brain.
14 Define how data should be in pure, canonical Python 3.7+; validate it with *pydantic*.
15
16 ## Help
17
18 See [documentation](https://pydantic-docs.helpmanual.io/) for more details.
19
20 ## Installation
21
22 Install using `pip install -U pydantic` or `conda install pydantic -c conda-forge`.
23 For more installation options to make *pydantic* even faster,
24 see the [Install](https://pydantic-docs.helpmanual.io/install/) section in the documentation.
25
26 ## A Simple Example
27
28 ```py
29 from datetime import datetime
30 from typing import List, Optional
31 from pydantic import BaseModel
32
33 class User(BaseModel):
34 id: int
35 name = 'John Doe'
36 signup_ts: Optional[datetime] = None
37 friends: List[int] = []
38
39 external_data = {'id': '123', 'signup_ts': '2017-06-01 12:22', 'friends': [1, '2', b'3']}
40 user = User(**external_data)
41 print(user)
42 #> User id=123 name='John Doe' signup_ts=datetime.datetime(2017, 6, 1, 12, 22) friends=[1, 2, 3]
43 print(user.id)
44 #> 123
45 ```
46
47 ## Contributing
48
49 For guidance on setting up a development environment and how to make a
50 contribution to *pydantic*, see
51 [Contributing to Pydantic](https://pydantic-docs.helpmanual.io/contributing/).
52
53 ## Reporting a Security Vulnerability
54
55 See our [security policy](https://github.com/samuelcolvin/pydantic/security/policy).
56
[end of README.md]
[start of pydantic/utils.py]
1 import warnings
2 import weakref
3 from collections import OrderedDict, defaultdict, deque
4 from copy import deepcopy
5 from itertools import islice, zip_longest
6 from types import BuiltinFunctionType, CodeType, FunctionType, GeneratorType, LambdaType, ModuleType
7 from typing import (
8 TYPE_CHECKING,
9 AbstractSet,
10 Any,
11 Callable,
12 Collection,
13 Dict,
14 Generator,
15 Iterable,
16 Iterator,
17 List,
18 Mapping,
19 Optional,
20 Set,
21 Tuple,
22 Type,
23 TypeVar,
24 Union,
25 )
26
27 from typing_extensions import Annotated
28
29 from .errors import ConfigError
30 from .typing import (
31 NoneType,
32 WithArgsTypes,
33 all_literal_values,
34 display_as_type,
35 get_args,
36 get_origin,
37 is_literal_type,
38 is_union,
39 )
40 from .version import version_info
41
42 if TYPE_CHECKING:
43 from inspect import Signature
44 from pathlib import Path
45
46 from .config import BaseConfig
47 from .dataclasses import Dataclass
48 from .fields import ModelField
49 from .main import BaseModel
50 from .typing import AbstractSetIntStr, DictIntStrAny, IntStr, MappingIntStrAny, ReprArgs
51
52 __all__ = (
53 'import_string',
54 'sequence_like',
55 'validate_field_name',
56 'lenient_isinstance',
57 'lenient_issubclass',
58 'in_ipython',
59 'deep_update',
60 'update_not_none',
61 'almost_equal_floats',
62 'get_model',
63 'to_camel',
64 'is_valid_field',
65 'smart_deepcopy',
66 'PyObjectStr',
67 'Representation',
68 'GetterDict',
69 'ValueItems',
70 'version_info', # required here to match behaviour in v1.3
71 'ClassAttribute',
72 'path_type',
73 'ROOT_KEY',
74 'get_unique_discriminator_alias',
75 'get_discriminator_alias_and_values',
76 )
77
78 ROOT_KEY = '__root__'
79 # these are types that are returned unchanged by deepcopy
80 IMMUTABLE_NON_COLLECTIONS_TYPES: Set[Type[Any]] = {
81 int,
82 float,
83 complex,
84 str,
85 bool,
86 bytes,
87 type,
88 NoneType,
89 FunctionType,
90 BuiltinFunctionType,
91 LambdaType,
92 weakref.ref,
93 CodeType,
94 # note: including ModuleType will differ from behaviour of deepcopy by not producing error.
95 # It might be not a good idea in general, but considering that this function used only internally
96 # against default values of fields, this will allow to actually have a field with module as default value
97 ModuleType,
98 NotImplemented.__class__,
99 Ellipsis.__class__,
100 }
101
102 # these are types that if empty, might be copied with simple copy() instead of deepcopy()
103 BUILTIN_COLLECTIONS: Set[Type[Any]] = {
104 list,
105 set,
106 tuple,
107 frozenset,
108 dict,
109 OrderedDict,
110 defaultdict,
111 deque,
112 }
113
114
115 def import_string(dotted_path: str) -> Any:
116 """
117 Stolen approximately from django. Import a dotted module path and return the attribute/class designated by the
118 last name in the path. Raise ImportError if the import fails.
119 """
120 from importlib import import_module
121
122 try:
123 module_path, class_name = dotted_path.strip(' ').rsplit('.', 1)
124 except ValueError as e:
125 raise ImportError(f'"{dotted_path}" doesn\'t look like a module path') from e
126
127 module = import_module(module_path)
128 try:
129 return getattr(module, class_name)
130 except AttributeError as e:
131 raise ImportError(f'Module "{module_path}" does not define a "{class_name}" attribute') from e
132
133
134 def truncate(v: Union[str], *, max_len: int = 80) -> str:
135 """
136 Truncate a value and add a unicode ellipsis (three dots) to the end if it was too long
137 """
138 warnings.warn('`truncate` is no-longer used by pydantic and is deprecated', DeprecationWarning)
139 if isinstance(v, str) and len(v) > (max_len - 2):
140 # -3 so quote + string + … + quote has correct length
141 return (v[: (max_len - 3)] + '…').__repr__()
142 try:
143 v = v.__repr__()
144 except TypeError:
145 v = v.__class__.__repr__(v) # in case v is a type
146 if len(v) > max_len:
147 v = v[: max_len - 1] + '…'
148 return v
149
150
151 def sequence_like(v: Any) -> bool:
152 return isinstance(v, (list, tuple, set, frozenset, GeneratorType, deque))
153
154
155 def validate_field_name(bases: List[Type['BaseModel']], field_name: str) -> None:
156 """
157 Ensure that the field's name does not shadow an existing attribute of the model.
158 """
159 for base in bases:
160 if getattr(base, field_name, None):
161 raise NameError(
162 f'Field name "{field_name}" shadows a BaseModel attribute; '
163 f'use a different field name with "alias=\'{field_name}\'".'
164 )
165
166
167 def lenient_isinstance(o: Any, class_or_tuple: Union[Type[Any], Tuple[Type[Any], ...], None]) -> bool:
168 try:
169 return isinstance(o, class_or_tuple) # type: ignore[arg-type]
170 except TypeError:
171 return False
172
173
174 def lenient_issubclass(cls: Any, class_or_tuple: Union[Type[Any], Tuple[Type[Any], ...], None]) -> bool:
175 try:
176 return isinstance(cls, type) and issubclass(cls, class_or_tuple) # type: ignore[arg-type]
177 except TypeError:
178 if isinstance(cls, WithArgsTypes):
179 return False
180 raise # pragma: no cover
181
182
183 def in_ipython() -> bool:
184 """
185 Check whether we're in an ipython environment, including jupyter notebooks.
186 """
187 try:
188 eval('__IPYTHON__')
189 except NameError:
190 return False
191 else: # pragma: no cover
192 return True
193
194
195 KeyType = TypeVar('KeyType')
196
197
198 def deep_update(mapping: Dict[KeyType, Any], *updating_mappings: Dict[KeyType, Any]) -> Dict[KeyType, Any]:
199 updated_mapping = mapping.copy()
200 for updating_mapping in updating_mappings:
201 for k, v in updating_mapping.items():
202 if k in updated_mapping and isinstance(updated_mapping[k], dict) and isinstance(v, dict):
203 updated_mapping[k] = deep_update(updated_mapping[k], v)
204 else:
205 updated_mapping[k] = v
206 return updated_mapping
207
208
209 def update_not_none(mapping: Dict[Any, Any], **update: Any) -> None:
210 mapping.update({k: v for k, v in update.items() if v is not None})
211
212
213 def almost_equal_floats(value_1: float, value_2: float, *, delta: float = 1e-8) -> bool:
214 """
215 Return True if two floats are almost equal
216 """
217 return abs(value_1 - value_2) <= delta
218
219
220 def generate_model_signature(
221 init: Callable[..., None], fields: Dict[str, 'ModelField'], config: Type['BaseConfig']
222 ) -> 'Signature':
223 """
224 Generate signature for model based on its fields
225 """
226 from inspect import Parameter, Signature, signature
227
228 from .config import Extra
229
230 present_params = signature(init).parameters.values()
231 merged_params: Dict[str, Parameter] = {}
232 var_kw = None
233 use_var_kw = False
234
235 for param in islice(present_params, 1, None): # skip self arg
236 if param.kind is param.VAR_KEYWORD:
237 var_kw = param
238 continue
239 merged_params[param.name] = param
240
241 if var_kw: # if custom init has no var_kw, fields which are not declared in it cannot be passed through
242 allow_names = config.allow_population_by_field_name
243 for field_name, field in fields.items():
244 param_name = field.alias
245 if field_name in merged_params or param_name in merged_params:
246 continue
247 elif not param_name.isidentifier():
248 if allow_names and field_name.isidentifier():
249 param_name = field_name
250 else:
251 use_var_kw = True
252 continue
253
254 # TODO: replace annotation with actual expected types once #1055 solved
255 kwargs = {'default': field.default} if not field.required else {}
256 merged_params[param_name] = Parameter(
257 param_name, Parameter.KEYWORD_ONLY, annotation=field.outer_type_, **kwargs
258 )
259
260 if config.extra is Extra.allow:
261 use_var_kw = True
262
263 if var_kw and use_var_kw:
264 # Make sure the parameter for extra kwargs
265 # does not have the same name as a field
266 default_model_signature = [
267 ('__pydantic_self__', Parameter.POSITIONAL_OR_KEYWORD),
268 ('data', Parameter.VAR_KEYWORD),
269 ]
270 if [(p.name, p.kind) for p in present_params] == default_model_signature:
271 # if this is the standard model signature, use extra_data as the extra args name
272 var_kw_name = 'extra_data'
273 else:
274 # else start from var_kw
275 var_kw_name = var_kw.name
276
277 # generate a name that's definitely unique
278 while var_kw_name in fields:
279 var_kw_name += '_'
280 merged_params[var_kw_name] = var_kw.replace(name=var_kw_name)
281
282 return Signature(parameters=list(merged_params.values()), return_annotation=None)
283
284
285 def get_model(obj: Union[Type['BaseModel'], Type['Dataclass']]) -> Type['BaseModel']:
286 from .main import BaseModel
287
288 try:
289 model_cls = obj.__pydantic_model__ # type: ignore
290 except AttributeError:
291 model_cls = obj
292
293 if not issubclass(model_cls, BaseModel):
294 raise TypeError('Unsupported type, must be either BaseModel or dataclass')
295 return model_cls
296
297
298 def to_camel(string: str) -> str:
299 return ''.join(word.capitalize() for word in string.split('_'))
300
301
302 T = TypeVar('T')
303
304
305 def unique_list(
306 input_list: Union[List[T], Tuple[T, ...]],
307 *,
308 name_factory: Callable[[T], str] = str,
309 ) -> List[T]:
310 """
311 Make a list unique while maintaining order.
312 We update the list if another one with the same name is set
313 (e.g. root validator overridden in subclass)
314 """
315 result: List[T] = []
316 result_names: List[str] = []
317 for v in input_list:
318 v_name = name_factory(v)
319 if v_name not in result_names:
320 result_names.append(v_name)
321 result.append(v)
322 else:
323 result[result_names.index(v_name)] = v
324
325 return result
326
327
328 class PyObjectStr(str):
329 """
330 String class where repr doesn't include quotes. Useful with Representation when you want to return a string
331 representation of something that valid (or pseudo-valid) python.
332 """
333
334 def __repr__(self) -> str:
335 return str(self)
336
337
338 class Representation:
339 """
340 Mixin to provide __str__, __repr__, and __pretty__ methods. See #884 for more details.
341
342 __pretty__ is used by [devtools](https://python-devtools.helpmanual.io/) to provide human readable representations
343 of objects.
344 """
345
346 __slots__: Tuple[str, ...] = tuple()
347
348 def __repr_args__(self) -> 'ReprArgs':
349 """
350 Returns the attributes to show in __str__, __repr__, and __pretty__ this is generally overridden.
351
352 Can either return:
353 * name - value pairs, e.g.: `[('foo_name', 'foo'), ('bar_name', ['b', 'a', 'r'])]`
354 * or, just values, e.g.: `[(None, 'foo'), (None, ['b', 'a', 'r'])]`
355 """
356 attrs = ((s, getattr(self, s)) for s in self.__slots__)
357 return [(a, v) for a, v in attrs if v is not None]
358
359 def __repr_name__(self) -> str:
360 """
361 Name of the instance's class, used in __repr__.
362 """
363 return self.__class__.__name__
364
365 def __repr_str__(self, join_str: str) -> str:
366 return join_str.join(repr(v) if a is None else f'{a}={v!r}' for a, v in self.__repr_args__())
367
368 def __pretty__(self, fmt: Callable[[Any], Any], **kwargs: Any) -> Generator[Any, None, None]:
369 """
370 Used by devtools (https://python-devtools.helpmanual.io/) to provide a human readable representations of objects
371 """
372 yield self.__repr_name__() + '('
373 yield 1
374 for name, value in self.__repr_args__():
375 if name is not None:
376 yield name + '='
377 yield fmt(value)
378 yield ','
379 yield 0
380 yield -1
381 yield ')'
382
383 def __str__(self) -> str:
384 return self.__repr_str__(' ')
385
386 def __repr__(self) -> str:
387 return f'{self.__repr_name__()}({self.__repr_str__(", ")})'
388
389
390 class GetterDict(Representation):
391 """
392 Hack to make object's smell just enough like dicts for validate_model.
393
394 We can't inherit from Mapping[str, Any] because it upsets cython so we have to implement all methods ourselves.
395 """
396
397 __slots__ = ('_obj',)
398
399 def __init__(self, obj: Any):
400 self._obj = obj
401
402 def __getitem__(self, key: str) -> Any:
403 try:
404 return getattr(self._obj, key)
405 except AttributeError as e:
406 raise KeyError(key) from e
407
408 def get(self, key: Any, default: Any = None) -> Any:
409 return getattr(self._obj, key, default)
410
411 def extra_keys(self) -> Set[Any]:
412 """
413 We don't want to get any other attributes of obj if the model didn't explicitly ask for them
414 """
415 return set()
416
417 def keys(self) -> List[Any]:
418 """
419 Keys of the pseudo dictionary, uses a list not set so order information can be maintained like python
420 dictionaries.
421 """
422 return list(self)
423
424 def values(self) -> List[Any]:
425 return [self[k] for k in self]
426
427 def items(self) -> Iterator[Tuple[str, Any]]:
428 for k in self:
429 yield k, self.get(k)
430
431 def __iter__(self) -> Iterator[str]:
432 for name in dir(self._obj):
433 if not name.startswith('_'):
434 yield name
435
436 def __len__(self) -> int:
437 return sum(1 for _ in self)
438
439 def __contains__(self, item: Any) -> bool:
440 return item in self.keys()
441
442 def __eq__(self, other: Any) -> bool:
443 return dict(self) == dict(other.items())
444
445 def __repr_args__(self) -> 'ReprArgs':
446 return [(None, dict(self))]
447
448 def __repr_name__(self) -> str:
449 return f'GetterDict[{display_as_type(self._obj)}]'
450
451
452 class ValueItems(Representation):
453 """
454 Class for more convenient calculation of excluded or included fields on values.
455 """
456
457 __slots__ = ('_items', '_type')
458
459 def __init__(self, value: Any, items: Union['AbstractSetIntStr', 'MappingIntStrAny']) -> None:
460 items = self._coerce_items(items)
461
462 if isinstance(value, (list, tuple)):
463 items = self._normalize_indexes(items, len(value))
464
465 self._items: 'MappingIntStrAny' = items
466
467 def is_excluded(self, item: Any) -> bool:
468 """
469 Check if item is fully excluded.
470
471 :param item: key or index of a value
472 """
473 return self.is_true(self._items.get(item))
474
475 def is_included(self, item: Any) -> bool:
476 """
477 Check if value is contained in self._items
478
479 :param item: key or index of value
480 """
481 return item in self._items
482
483 def for_element(self, e: 'IntStr') -> Optional[Union['AbstractSetIntStr', 'MappingIntStrAny']]:
484 """
485 :param e: key or index of element on value
486 :return: raw values for element if self._items is dict and contain needed element
487 """
488
489 item = self._items.get(e)
490 return item if not self.is_true(item) else None
491
492 def _normalize_indexes(self, items: 'MappingIntStrAny', v_length: int) -> 'DictIntStrAny':
493 """
494 :param items: dict or set of indexes which will be normalized
495 :param v_length: length of sequence indexes of which will be
496
497 >>> self._normalize_indexes({0: True, -2: True, -1: True}, 4)
498 {0: True, 2: True, 3: True}
499 >>> self._normalize_indexes({'__all__': True}, 4)
500 {0: True, 1: True, 2: True, 3: True}
501 """
502
503 normalized_items: 'DictIntStrAny' = {}
504 all_items = None
505 for i, v in items.items():
506 if not (isinstance(v, Mapping) or isinstance(v, AbstractSet) or self.is_true(v)):
507 raise TypeError(f'Unexpected type of exclude value for index "{i}" {v.__class__}')
508 if i == '__all__':
509 all_items = self._coerce_value(v)
510 continue
511 if not isinstance(i, int):
512 raise TypeError(
513 'Excluding fields from a sequence of sub-models or dicts must be performed index-wise: '
514 'expected integer keys or keyword "__all__"'
515 )
516 normalized_i = v_length + i if i < 0 else i
517 normalized_items[normalized_i] = self.merge(v, normalized_items.get(normalized_i))
518
519 if not all_items:
520 return normalized_items
521 if self.is_true(all_items):
522 for i in range(v_length):
523 normalized_items.setdefault(i, ...)
524 return normalized_items
525 for i in range(v_length):
526 normalized_item = normalized_items.setdefault(i, {})
527 if not self.is_true(normalized_item):
528 normalized_items[i] = self.merge(all_items, normalized_item)
529 return normalized_items
530
531 @classmethod
532 def merge(cls, base: Any, override: Any, intersect: bool = False) -> Any:
533 """
534 Merge a ``base`` item with an ``override`` item.
535
536 Both ``base`` and ``override`` are converted to dictionaries if possible.
537 Sets are converted to dictionaries with the sets entries as keys and
538 Ellipsis as values.
539
540 Each key-value pair existing in ``base`` is merged with ``override``,
541 while the rest of the key-value pairs are updated recursively with this function.
542
543 Merging takes place based on the "union" of keys if ``intersect`` is
544 set to ``False`` (default) and on the intersection of keys if
545 ``intersect`` is set to ``True``.
546 """
547 override = cls._coerce_value(override)
548 base = cls._coerce_value(base)
549 if override is None:
550 return base
551 if cls.is_true(base) or base is None:
552 return override
553 if cls.is_true(override):
554 return base if intersect else override
555
556 # intersection or union of keys while preserving ordering:
557 if intersect:
558 merge_keys = [k for k in base if k in override] + [k for k in override if k in base]
559 else:
560 merge_keys = list(base) + [k for k in override if k not in base]
561
562 merged: 'DictIntStrAny' = {}
563 for k in merge_keys:
564 merged_item = cls.merge(base.get(k), override.get(k), intersect=intersect)
565 if merged_item is not None:
566 merged[k] = merged_item
567
568 return merged
569
570 @staticmethod
571 def _coerce_items(items: Union['AbstractSetIntStr', 'MappingIntStrAny']) -> 'MappingIntStrAny':
572 if isinstance(items, Mapping):
573 pass
574 elif isinstance(items, AbstractSet):
575 items = dict.fromkeys(items, ...)
576 else:
577 class_name = getattr(items, '__class__', '???')
578 raise TypeError(f'Unexpected type of exclude value {class_name}')
579 return items
580
581 @classmethod
582 def _coerce_value(cls, value: Any) -> Any:
583 if value is None or cls.is_true(value):
584 return value
585 return cls._coerce_items(value)
586
587 @staticmethod
588 def is_true(v: Any) -> bool:
589 return v is True or v is ...
590
591 def __repr_args__(self) -> 'ReprArgs':
592 return [(None, self._items)]
593
594
595 class ClassAttribute:
596 """
597 Hide class attribute from its instances
598 """
599
600 __slots__ = (
601 'name',
602 'value',
603 )
604
605 def __init__(self, name: str, value: Any) -> None:
606 self.name = name
607 self.value = value
608
609 def __get__(self, instance: Any, owner: Type[Any]) -> None:
610 if instance is None:
611 return self.value
612 raise AttributeError(f'{self.name!r} attribute of {owner.__name__!r} is class-only')
613
614
615 path_types = {
616 'is_dir': 'directory',
617 'is_file': 'file',
618 'is_mount': 'mount point',
619 'is_symlink': 'symlink',
620 'is_block_device': 'block device',
621 'is_char_device': 'char device',
622 'is_fifo': 'FIFO',
623 'is_socket': 'socket',
624 }
625
626
627 def path_type(p: 'Path') -> str:
628 """
629 Find out what sort of thing a path is.
630 """
631 assert p.exists(), 'path does not exist'
632 for method, name in path_types.items():
633 if getattr(p, method)():
634 return name
635
636 return 'unknown'
637
638
639 Obj = TypeVar('Obj')
640
641
642 def smart_deepcopy(obj: Obj) -> Obj:
643 """
644 Return type as is for immutable built-in types
645 Use obj.copy() for built-in empty collections
646 Use copy.deepcopy() for non-empty collections and unknown objects
647 """
648
649 obj_type = obj.__class__
650 if obj_type in IMMUTABLE_NON_COLLECTIONS_TYPES:
651 return obj # fastest case: obj is immutable and not collection therefore will not be copied anyway
652 elif not obj and obj_type in BUILTIN_COLLECTIONS:
653 # faster way for empty collections, no need to copy its members
654 return obj if obj_type is tuple else obj.copy() # type: ignore # tuple doesn't have copy method
655 return deepcopy(obj) # slowest way when we actually might need a deepcopy
656
657
658 def is_valid_field(name: str) -> bool:
659 if not name.startswith('_'):
660 return True
661 return ROOT_KEY == name
662
663
664 def is_valid_private_name(name: str) -> bool:
665 return not is_valid_field(name) and name not in {
666 '__annotations__',
667 '__classcell__',
668 '__doc__',
669 '__module__',
670 '__orig_bases__',
671 '__qualname__',
672 }
673
674
675 _EMPTY = object()
676
677
678 def all_identical(left: Iterable[Any], right: Iterable[Any]) -> bool:
679 """
680 Check that the items of `left` are the same objects as those in `right`.
681
682 >>> a, b = object(), object()
683 >>> all_identical([a, b, a], [a, b, a])
684 True
685 >>> all_identical([a, b, [a]], [a, b, [a]]) # new list object, while "equal" is not "identical"
686 False
687 """
688 for left_item, right_item in zip_longest(left, right, fillvalue=_EMPTY):
689 if left_item is not right_item:
690 return False
691 return True
692
693
694 def get_unique_discriminator_alias(all_aliases: Collection[str], discriminator_key: str) -> str:
695 """Validate that all aliases are the same and if that's the case return the alias"""
696 unique_aliases = set(all_aliases)
697 if len(unique_aliases) > 1:
698 raise ConfigError(
699 f'Aliases for discriminator {discriminator_key!r} must be the same (got {", ".join(sorted(all_aliases))})'
700 )
701 return unique_aliases.pop()
702
703
704 def get_discriminator_alias_and_values(tp: Any, discriminator_key: str) -> Tuple[str, Tuple[str, ...]]:
705 """
706 Get alias and all valid values in the `Literal` type of the discriminator field
707 `tp` can be a `BaseModel` class or directly an `Annotated` `Union` of many.
708 """
709 is_root_model = getattr(tp, '__custom_root_type__', False)
710
711 if get_origin(tp) is Annotated:
712 tp = get_args(tp)[0]
713
714 if hasattr(tp, '__pydantic_model__'):
715 tp = tp.__pydantic_model__
716
717 if is_union(get_origin(tp)):
718 alias, all_values = _get_union_alias_and_all_values(tp, discriminator_key)
719 return alias, tuple(v for values in all_values for v in values)
720 elif is_root_model:
721 union_type = tp.__fields__[ROOT_KEY].type_
722 alias, all_values = _get_union_alias_and_all_values(union_type, discriminator_key)
723
724 if len(set(all_values)) > 1:
725 raise ConfigError(
726 f'Field {discriminator_key!r} is not the same for all submodels of {display_as_type(tp)!r}'
727 )
728
729 return alias, all_values[0]
730
731 else:
732 try:
733 t_discriminator_type = tp.__fields__[discriminator_key].type_
734 except AttributeError as e:
735 raise TypeError(f'Type {tp.__name__!r} is not a valid `BaseModel` or `dataclass`') from e
736 except KeyError as e:
737 raise ConfigError(f'Model {tp.__name__!r} needs a discriminator field for key {discriminator_key!r}') from e
738
739 if not is_literal_type(t_discriminator_type):
740 raise ConfigError(f'Field {discriminator_key!r} of model {tp.__name__!r} needs to be a `Literal`')
741
742 return tp.__fields__[discriminator_key].alias, all_literal_values(t_discriminator_type)
743
744
745 def _get_union_alias_and_all_values(
746 union_type: Type[Any], discriminator_key: str
747 ) -> Tuple[str, Tuple[Tuple[str, ...], ...]]:
748 zipped_aliases_values = [get_discriminator_alias_and_values(t, discriminator_key) for t in get_args(union_type)]
749 # unzip: [('alias_a',('v1', 'v2)), ('alias_b', ('v3',))] => [('alias_a', 'alias_b'), (('v1', 'v2'), ('v3',))]
750 all_aliases, all_values = zip(*zipped_aliases_values)
751 return get_unique_discriminator_alias(all_aliases, discriminator_key), all_values
752
[end of pydantic/utils.py]
</code>
I need you to solve this issue by generating a single patch file that I can apply directly to this repository using git apply. Please respond with a single patch file in the following format.
<patch>
--- a/file.py
+++ b/file.py
@@ -1,27 +1,35 @@
def euclidean(a, b):
- while b:
- a, b = b, a % b
- return a
+ if b == 0:
+ return a
+ return euclidean(b, a % b)
def bresenham(x0, y0, x1, y1):
points = []
dx = abs(x1 - x0)
dy = abs(y1 - y0)
- sx = 1 if x0 < x1 else -1
- sy = 1 if y0 < y1 else -1
- err = dx - dy
+ x, y = x0, y0
+ sx = -1 if x0 > x1 else 1
+ sy = -1 if y0 > y1 else 1
- while True:
- points.append((x0, y0))
- if x0 == x1 and y0 == y1:
- break
- e2 = 2 * err
- if e2 > -dy:
+ if dx > dy:
+ err = dx / 2.0
+ while x != x1:
+ points.append((x, y))
err -= dy
- x0 += sx
- if e2 < dx:
- err += dx
- y0 += sy
+ if err < 0:
+ y += sy
+ err += dx
+ x += sx
+ else:
+ err = dy / 2.0
+ while y != y1:
+ points.append((x, y))
+ err -= dx
+ if err < 0:
+ x += sx
+ err += dy
+ y += sy
+ points.append((x, y))
return points
</patch>
| pydantic/pydantic | 8997cc5961139dd2695761a33c06a66adbf1430a | Model signatures contain reserved words.
### Checks
* [x] I added a descriptive title to this issue
* [x] I have searched (google, github) for similar issues and couldn't find anything
* [x] I have read and followed [the docs](https://pydantic-docs.helpmanual.io/) and still think this is a bug
# Bug
When creating a model with a field that has an alias that is a python reserved word, the signature uses the reserved word even when `allow_population_by_field_name` is true.
I expect the signature to use the field name in this case. I came across this issue when generating tests using hypothesis. This uses the model signature and was creating invalid test code as it was using the reserved words in the signature.
```
pydantic version: 1.9.0
pydantic compiled: False
python version: 3.9.0 (tags/v3.9.0:9cf6752, Oct 5 2020, 15:34:40) [MSC v.1927 64 bit (AMD64)]
platform: Windows-10-10.0.19041-SP0
optional deps. installed: ['devtools', 'dotenv', 'email-validator', 'typing-extensions']
```
```py
from pydantic import BaseModel, Field
from inspect import signature
class Foo(BaseModel):
from_: str = Field(..., alias='from')
class Config:
allow_population_by_field_name = True
str(signature(Foo))
```
> '(*, from: str) -> None'
# Suggestion
In `utils.generate_model_signature()` we have these two checks.
```py
elif not param_name.isidentifier():
if allow_names and field_name.isidentifier():
```
I propose replacing these with calls to the following (new) function.
```py
def is_valid_identifier(identifier: str) -> bool:
"""
Checks that a string is a valid identifier and not a reserved word.
:param identifier: The identifier to test.
:return: True if the identifier is valid.
"""
return identifier.isidentifier() and not keyword.iskeyword(identifier)
```
I believe that this behaviour is closer to what a user would expect in the common case that they are generating models from a schema containing python reserved words.
| 2022-04-25 16:42:47+00:00 | <patch>
diff --git a/pydantic/utils.py b/pydantic/utils.py
--- a/pydantic/utils.py
+++ b/pydantic/utils.py
@@ -1,3 +1,4 @@
+import keyword
import warnings
import weakref
from collections import OrderedDict, defaultdict, deque
@@ -56,6 +57,7 @@
'lenient_isinstance',
'lenient_issubclass',
'in_ipython',
+ 'is_valid_identifier',
'deep_update',
'update_not_none',
'almost_equal_floats',
@@ -192,6 +194,15 @@ def in_ipython() -> bool:
return True
+def is_valid_identifier(identifier: str) -> bool:
+ """
+ Checks that a string is a valid identifier and not a Python keyword.
+ :param identifier: The identifier to test.
+ :return: True if the identifier is valid.
+ """
+ return identifier.isidentifier() and not keyword.iskeyword(identifier)
+
+
KeyType = TypeVar('KeyType')
@@ -244,8 +255,8 @@ def generate_model_signature(
param_name = field.alias
if field_name in merged_params or param_name in merged_params:
continue
- elif not param_name.isidentifier():
- if allow_names and field_name.isidentifier():
+ elif not is_valid_identifier(param_name):
+ if allow_names and is_valid_identifier(field_name):
param_name = field_name
else:
use_var_kw = True
</patch> | diff --git a/tests/test_model_signature.py b/tests/test_model_signature.py
--- a/tests/test_model_signature.py
+++ b/tests/test_model_signature.py
@@ -84,6 +84,16 @@ class Config:
assert _equals(str(signature(Foo)), '(*, foo: str) -> None')
+def test_does_not_use_reserved_word():
+ class Foo(BaseModel):
+ from_: str = Field(..., alias='from')
+
+ class Config:
+ allow_population_by_field_name = True
+
+ assert _equals(str(signature(Foo)), '(*, from_: str) -> None')
+
+
def test_extra_allow_no_conflict():
class Model(BaseModel):
spam: str
| 0.0 | [
"tests/test_model_signature.py::test_does_not_use_reserved_word"
] | [
"tests/test_model_signature.py::test_model_signature",
"tests/test_model_signature.py::test_custom_init_signature",
"tests/test_model_signature.py::test_custom_init_signature_with_no_var_kw",
"tests/test_model_signature.py::test_invalid_identifiers_signature",
"tests/test_model_signature.py::test_use_field_name",
"tests/test_model_signature.py::test_extra_allow_no_conflict",
"tests/test_model_signature.py::test_extra_allow_conflict",
"tests/test_model_signature.py::test_extra_allow_conflict_twice",
"tests/test_model_signature.py::test_extra_allow_conflict_custom_signature",
"tests/test_model_signature.py::test_signature_is_class_only"
] | 8997cc5961139dd2695761a33c06a66adbf1430a |
|
django-money__django-money-594 | You will be provided with a partial code base and an issue statement explaining a problem to resolve.
<issue>
Money should not override `__rsub__` method
As [reported](https://github.com/py-moneyed/py-moneyed/issues/144) in pymoneyed the following test case fails:
```python
from moneyed import Money as PyMoney
from djmoney.money import Money
def test_sub_negative():
total = PyMoney(0, "EUR")
bills = (Money(8, "EUR"), Money(25, "EUR"))
for bill in bills:
total -= bill
assert total == Money(-33, "EUR")
# AssertionError: assert <Money: -17 EUR> == <Money: -33 EUR>
```
This is caused by `djmoney.money.Money` overriding `__rsub__`.
</issue>
<code>
[start of README.rst]
1 django-money
2 ============
3
4 .. image:: https://github.com/django-money/django-money/workflows/CI/badge.svg
5 :target: https://github.com/django-money/django-money/actions
6 :alt: Build Status
7
8 .. image:: http://codecov.io/github/django-money/django-money/coverage.svg?branch=master
9 :target: http://codecov.io/github/django-money/django-money?branch=master
10 :alt: Coverage Status
11
12 .. image:: https://readthedocs.org/projects/django-money/badge/?version=latest
13 :target: http://django-money.readthedocs.io/en/latest/
14 :alt: Documentation Status
15
16 .. image:: https://img.shields.io/pypi/v/django-money.svg
17 :target: https://pypi.python.org/pypi/django-money
18 :alt: PyPI
19
20 A little Django app that uses ``py-moneyed`` to add support for Money
21 fields in your models and forms.
22
23 * Django versions supported: 1.11, 2.1, 2.2, 3.0, 3.1
24 * Python versions supported: 3.5, 3.6, 3.7, 3.8, 3.9
25 * PyPy versions supported: PyPy3
26
27 If you need support for older versions of Django and Python, please refer to older releases mentioned in `the release notes <https://django-money.readthedocs.io/en/latest/changes.html>`__.
28
29 Through the dependency ``py-moneyed``, ``django-money`` gets:
30
31 * Support for proper Money value handling (using the standard Money
32 design pattern)
33 * A currency class and definitions for all currencies in circulation
34 * Formatting of most currencies with correct currency sign
35
36 Installation
37 ------------
38
39 Using `pip`:
40
41 .. code:: bash
42
43 $ pip install django-money
44
45 This automatically installs ``py-moneyed`` v0.8 (or later).
46
47 Add ``djmoney`` to your ``INSTALLED_APPS``. This is required so that money field are displayed correctly in the admin.
48
49 .. code:: python
50
51 INSTALLED_APPS = [
52 ...,
53 'djmoney',
54 ...
55 ]
56
57 Model usage
58 -----------
59
60 Use as normal model fields:
61
62 .. code:: python
63
64 from djmoney.models.fields import MoneyField
65 from django.db import models
66
67
68 class BankAccount(models.Model):
69 balance = MoneyField(max_digits=14, decimal_places=2, default_currency='USD')
70
71 To comply with certain strict accounting or financial regulations, you may consider using ``max_digits=19`` and ``decimal_places=4``, see more in this `StackOverflow answer <https://stackoverflow.com/a/224866/405682>`__
72
73 It is also possible to have a nullable ``MoneyField``:
74
75 .. code:: python
76
77 class BankAccount(models.Model):
78 money = MoneyField(max_digits=10, decimal_places=2, null=True, default_currency=None)
79
80 account = BankAccount.objects.create()
81 assert account.money is None
82 assert account.money_currency is None
83
84 Searching for models with money fields:
85
86 .. code:: python
87
88 from djmoney.money import Money
89
90
91 account = BankAccount.objects.create(balance=Money(10, 'USD'))
92 swissAccount = BankAccount.objects.create(balance=Money(10, 'CHF'))
93
94 BankAccount.objects.filter(balance__gt=Money(1, 'USD'))
95 # Returns the "account" object
96
97
98 Field validation
99 ----------------
100
101 There are 3 different possibilities for field validation:
102
103 * by numeric part of money despite on currency;
104 * by single money amount;
105 * by multiple money amounts.
106
107 All of them could be used in a combination as is shown below:
108
109 .. code:: python
110
111 from django.db import models
112 from djmoney.models.fields import MoneyField
113 from djmoney.money import Money
114 from djmoney.models.validators import MaxMoneyValidator, MinMoneyValidator
115
116
117 class BankAccount(models.Model):
118 balance = MoneyField(
119 max_digits=10,
120 decimal_places=2,
121 validators=[
122 MinMoneyValidator(10),
123 MaxMoneyValidator(1500),
124 MinMoneyValidator(Money(500, 'NOK')),
125 MaxMoneyValidator(Money(900, 'NOK')),
126 MinMoneyValidator({'EUR': 100, 'USD': 50}),
127 MaxMoneyValidator({'EUR': 1000, 'USD': 500}),
128 ]
129 )
130
131 The ``balance`` field from the model above has the following validation:
132
133 * All input values should be between 10 and 1500 despite on currency;
134 * Norwegian Crowns amount (NOK) should be between 500 and 900;
135 * Euros should be between 100 and 1000;
136 * US Dollars should be between 50 and 500;
137
138 Adding a new Currency
139 ---------------------
140
141 Currencies are listed on moneyed, and this modules use this to provide a
142 choice list on the admin, also for validation.
143
144 To add a new currency available on all the project, you can simple add
145 this two lines on your ``settings.py`` file
146
147 .. code:: python
148
149 import moneyed
150 from moneyed.localization import _FORMATTER
151 from decimal import ROUND_HALF_EVEN
152
153
154 BOB = moneyed.add_currency(
155 code='BOB',
156 numeric='068',
157 name='Peso boliviano',
158 countries=('BOLIVIA', )
159 )
160
161 # Currency Formatter will output 2.000,00 Bs.
162 _FORMATTER.add_sign_definition(
163 'default',
164 BOB,
165 prefix=u'Bs. '
166 )
167
168 _FORMATTER.add_formatting_definition(
169 'es_BO',
170 group_size=3, group_separator=".", decimal_point=",",
171 positive_sign="", trailing_positive_sign="",
172 negative_sign="-", trailing_negative_sign="",
173 rounding_method=ROUND_HALF_EVEN
174 )
175
176 To restrict the currencies listed on the project set a ``CURRENCIES``
177 variable with a list of Currency codes on ``settings.py``
178
179 .. code:: python
180
181 CURRENCIES = ('USD', 'BOB')
182
183 **The list has to contain valid Currency codes**
184
185 Additionally there is an ability to specify currency choices directly:
186
187 .. code:: python
188
189 CURRENCIES = ('USD', 'EUR')
190 CURRENCY_CHOICES = [('USD', 'USD $'), ('EUR', 'EUR €')]
191
192 Important note on model managers
193 --------------------------------
194
195 Django-money leaves you to use any custom model managers you like for
196 your models, but it needs to wrap some of the methods to allow searching
197 for models with money values.
198
199 This is done automatically for the "objects" attribute in any model that
200 uses MoneyField. However, if you assign managers to some other
201 attribute, you have to wrap your manager manually, like so:
202
203 .. code:: python
204
205 from djmoney.models.managers import money_manager
206
207
208 class BankAccount(models.Model):
209 balance = MoneyField(max_digits=10, decimal_places=2, default_currency='USD')
210 accounts = money_manager(MyCustomManager())
211
212 Also, the money\_manager wrapper only wraps the standard QuerySet
213 methods. If you define custom QuerySet methods, that do not end up using
214 any of the standard ones (like "get", "filter" and so on), then you also
215 need to manually decorate those custom methods, like so:
216
217 .. code:: python
218
219 from djmoney.models.managers import understands_money
220
221
222 class MyCustomQuerySet(QuerySet):
223
224 @understands_money
225 def my_custom_method(*args, **kwargs):
226 # Awesome stuff
227
228 Format localization
229 -------------------
230
231 The formatting is turned on if you have set ``USE_L10N = True`` in the
232 your settings file.
233
234 If formatting is disabled in the configuration, then in the templates
235 will be used default formatting.
236
237 In the templates you can use a special tag to format the money.
238
239 In the file ``settings.py`` add to ``INSTALLED_APPS`` entry from the
240 library ``djmoney``:
241
242 .. code:: python
243
244 INSTALLED_APPS += ('djmoney', )
245
246 In the template, add:
247
248 ::
249
250 {% load djmoney %}
251 ...
252 {% money_localize money %}
253
254 and that is all.
255
256 Instructions to the tag ``money_localize``:
257
258 ::
259
260 {% money_localize <money_object> [ on(default) | off ] [as var_name] %}
261 {% money_localize <amount> <currency> [ on(default) | off ] [as var_name] %}
262
263 Examples:
264
265 The same effect:
266
267 ::
268
269 {% money_localize money_object %}
270 {% money_localize money_object on %}
271
272 Assignment to a variable:
273
274 ::
275
276 {% money_localize money_object on as NEW_MONEY_OBJECT %}
277
278 Formatting the number with currency:
279
280 ::
281
282 {% money_localize '4.5' 'USD' %}
283
284 ::
285
286 Return::
287
288 Money object
289
290
291 Testing
292 -------
293
294 Install the required packages:
295
296 ::
297
298 git clone https://github.com/django-money/django-money
299
300 cd ./django-money/
301
302 pip install -e ".[test]" # installation with required packages for testing
303
304 Recommended way to run the tests:
305
306 .. code:: bash
307
308 tox
309
310 Testing the application in the current environment python:
311
312 .. code:: bash
313
314 make test
315
316 Working with Exchange Rates
317 ---------------------------
318
319 To work with exchange rates, add the following to your ``INSTALLED_APPS``.
320
321 .. code:: python
322
323 INSTALLED_APPS = [
324 ...,
325 'djmoney.contrib.exchange',
326 ]
327
328 Also, it is required to have ``certifi`` installed.
329 It could be done via installing ``djmoney`` with ``exchange`` extra:
330
331 .. code:: bash
332
333 pip install "django-money[exchange]"
334
335 To create required relations run ``python manage.py migrate``. To fill these relations with data you need to choose a
336 data source. Currently, 2 data sources are supported - https://openexchangerates.org/ (default) and https://fixer.io/.
337 To choose another data source set ``EXCHANGE_BACKEND`` settings with importable string to the backend you need:
338
339 .. code:: python
340
341 EXCHANGE_BACKEND = 'djmoney.contrib.exchange.backends.FixerBackend'
342
343 If you want to implement your own backend, you need to extend ``djmoney.contrib.exchange.backends.base.BaseExchangeBackend``.
344 Two data sources mentioned above are not open, so you have to specify access keys in order to use them:
345
346 ``OPEN_EXCHANGE_RATES_APP_ID`` - '<your actual key from openexchangerates.org>'
347
348 ``FIXER_ACCESS_KEY`` - '<your actual key from fixer.io>'
349
350 Backends return rates for a base currency, by default it is USD, but could be changed via ``BASE_CURRENCY`` setting.
351 Open Exchanger Rates & Fixer supports some extra stuff, like historical data or restricting currencies
352 in responses to the certain list. In order to use these features you could change default URLs for these backends:
353
354 .. code:: python
355
356 OPEN_EXCHANGE_RATES_URL = 'https://openexchangerates.org/api/historical/2017-01-01.json?symbols=EUR,NOK,SEK,CZK'
357 FIXER_URL = 'http://data.fixer.io/api/2013-12-24?symbols=EUR,NOK,SEK,CZK'
358
359 Or, you could pass it directly to ``update_rates`` method:
360
361 .. code:: python
362
363 >>> from djmoney.contrib.exchange.backends import OpenExchangeRatesBackend
364 >>> backend = OpenExchangeRatesBackend(url='https://openexchangerates.org/api/historical/2017-01-01.json')
365 >>> backend.update_rates(symbols='EUR,NOK,SEK,CZK')
366
367 There is a possibility to use multiple backends in the same time:
368
369 .. code:: python
370
371 >>> from djmoney.contrib.exchange.backends import FixerBackend, OpenExchangeRatesBackend
372 >>> from djmoney.contrib.exchange.models import get_rate
373 >>> OpenExchangeRatesBackend().update_rates()
374 >>> FixerBackend().update_rates()
375 >>> get_rate('USD', 'EUR', backend=OpenExchangeRatesBackend.name)
376 >>> get_rate('USD', 'EUR', backend=FixerBackend.name)
377
378 Regular operations with ``Money`` will use ``EXCHANGE_BACKEND`` backend to get the rates.
379 Also, there are two management commands for updating rates and removing them:
380
381 .. code:: bash
382
383 $ python manage.py update_rates
384 Successfully updated rates from openexchangerates.org
385 $ python manage.py clear_rates
386 Successfully cleared rates for openexchangerates.org
387
388 Both of them accept ``-b/--backend`` option, that will update/clear data only for this backend.
389 And ``clear_rates`` accepts ``-a/--all`` option, that will clear data for all backends.
390
391 To set up a periodic rates update you could use Celery task:
392
393 .. code:: python
394
395 CELERYBEAT_SCHEDULE = {
396 'update_rates': {
397 'task': 'path.to.your.task',
398 'schedule': crontab(minute=0, hour=0),
399 'kwargs': {} # For custom arguments
400 }
401 }
402
403 Example task implementation:
404
405 .. code:: python
406
407 from django.utils.module_loading import import_string
408
409 from celery import Celery
410 from djmoney import settings
411
412
413 app = Celery('tasks', broker='pyamqp://guest@localhost//')
414
415
416 @app.task
417 def update_rates(backend=settings.EXCHANGE_BACKEND, **kwargs):
418 backend = import_string(backend)()
419 backend.update_rates(**kwargs)
420
421 To convert one currency to another:
422
423 .. code:: python
424
425 >>> from djmoney.money import Money
426 >>> from djmoney.contrib.exchange.models import convert_money
427 >>> convert_money(Money(100, 'EUR'), 'USD')
428 <Money: 122.8184375038380800 USD>
429
430 Exchange rates are integrated with Django Admin.
431
432 django-money can be configured to automatically use this app for currency
433 conversions by settings ``AUTO_CONVERT_MONEY = True`` in your Django
434 settings. Note that currency conversion is a lossy process, so automatic
435 conversion is usually a good strategy only for very simple use cases. For most
436 use cases you will need to be clear about exactly when currency conversion
437 occurs, and automatic conversion can hide bugs. Also, with automatic conversion
438 you lose some properties like commutativity (``A + B == B + A``) due to
439 conversions happening in different directions.
440
441 Usage with Django REST Framework
442 --------------------------------
443
444 Make sure that ``djmoney`` and is in the ``INSTALLED_APPS`` of your
445 ``settings.py`` and that ``rest_framework`` has been installed. MoneyField will
446 automatically register a serializer for Django REST Framework through
447 ``djmoney.apps.MoneyConfig.ready()``.
448
449 You can add a serializable field the following way:
450
451 .. code:: python
452
453 from djmoney.contrib.django_rest_framework import MoneyField
454
455 class Serializers(serializers.Serializer):
456 my_computed_prop = MoneyField(max_digits=10, decimal_places=2)
457
458
459 Built-in serializer works in the following way:
460
461 .. code:: python
462
463 class Expenses(models.Model):
464 amount = MoneyField(max_digits=10, decimal_places=2)
465
466
467 class Serializer(serializers.ModelSerializer):
468 class Meta:
469 model = Expenses
470 fields = '__all__'
471
472 >>> instance = Expenses.objects.create(amount=Money(10, 'EUR'))
473 >>> serializer = Serializer(instance=instance)
474 >>> serializer.data
475 ReturnDict([
476 ('id', 1),
477 ('amount_currency', 'EUR'),
478 ('amount', '10.000'),
479 ])
480
481 Note that when specifying individual fields on your serializer, the amount and currency fields are treated separately.
482 To achieve the same behaviour as above you would include both field names:
483
484 .. code:: python
485
486 class Serializer(serializers.ModelSerializer):
487 class Meta:
488 model = Expenses
489 fields = ('id', 'amount', 'amount_currency')
490
491 Customization
492 -------------
493
494 If there is a need to customize the process deconstructing ``Money`` instances onto Django Fields and the other way around,
495 then it is possible to use a custom descriptor like this one:
496
497 .. code:: python
498
499 class MyMoneyDescriptor:
500
501 def __get__(self, obj, type=None):
502 amount = obj.__dict__[self.field.name]
503 return Money(amount, "EUR")
504
505 It will always use ``EUR`` for all ``Money`` instances when ``obj.money`` is called. Then it should be passed to ``MoneyField``:
506
507 .. code:: python
508
509 class Expenses(models.Model):
510 amount = MoneyField(max_digits=10, decimal_places=2, money_descriptor_class=MyMoneyDescriptor)
511
512
513 Background
514 ----------
515
516 This project is a fork of the Django support that was in
517 http://code.google.com/p/python-money/
518
519 This version adds tests, and comes with several critical bugfixes.
520
[end of README.rst]
[start of djmoney/money.py]
1 from django.conf import settings
2 from django.db.models import F
3 from django.utils import translation
4 from django.utils.deconstruct import deconstructible
5 from django.utils.html import avoid_wrapping, conditional_escape
6 from django.utils.safestring import mark_safe
7
8 from moneyed import Currency, Money as DefaultMoney
9 from moneyed.localization import _FORMATTER, format_money
10
11 from .settings import DECIMAL_PLACES, DECIMAL_PLACES_DISPLAY
12
13
14 __all__ = ["Money", "Currency"]
15
16
17 @deconstructible
18 class Money(DefaultMoney):
19 """
20 Extends functionality of Money with Django-related features.
21 """
22
23 use_l10n = None
24
25 def __init__(self, *args, decimal_places_display=None, **kwargs):
26 self.decimal_places = kwargs.pop("decimal_places", DECIMAL_PLACES)
27 self._decimal_places_display = decimal_places_display
28 super().__init__(*args, **kwargs)
29
30 @property
31 def decimal_places_display(self):
32 if self._decimal_places_display is None:
33 return DECIMAL_PLACES_DISPLAY.get(self.currency.code, self.decimal_places)
34
35 return self._decimal_places_display
36
37 @decimal_places_display.setter
38 def decimal_places_display(self, value):
39 """ Set number of digits being displayed - `None` resets to `DECIMAL_PLACES_DISPLAY` setting """
40 self._decimal_places_display = value
41
42 def _fix_decimal_places(self, *args):
43 """ Make sure to user 'biggest' number of decimal places of all given money instances """
44 candidates = (getattr(candidate, "decimal_places", 0) for candidate in args)
45 return max([self.decimal_places, *candidates])
46
47 def __add__(self, other):
48 if isinstance(other, F):
49 return other.__radd__(self)
50 other = maybe_convert(other, self.currency)
51 result = super().__add__(other)
52 result.decimal_places = self._fix_decimal_places(other)
53 return result
54
55 def __sub__(self, other):
56 if isinstance(other, F):
57 return other.__rsub__(self)
58 other = maybe_convert(other, self.currency)
59 result = super().__sub__(other)
60 result.decimal_places = self._fix_decimal_places(other)
61 return result
62
63 def __mul__(self, other):
64 if isinstance(other, F):
65 return other.__rmul__(self)
66 result = super().__mul__(other)
67 result.decimal_places = self._fix_decimal_places(other)
68 return result
69
70 def __truediv__(self, other):
71 if isinstance(other, F):
72 return other.__rtruediv__(self)
73 result = super().__truediv__(other)
74 if isinstance(result, self.__class__):
75 result.decimal_places = self._fix_decimal_places(other)
76 return result
77
78 def __rtruediv__(self, other):
79 # Backported from py-moneyd, non released bug-fix
80 # https://github.com/py-moneyed/py-moneyed/blob/c518745dd9d7902781409daec1a05699799474dd/moneyed/classes.py#L217-L218
81 raise TypeError("Cannot divide non-Money by a Money instance.")
82
83 @property
84 def is_localized(self):
85 if self.use_l10n is None:
86 return settings.USE_L10N
87 return self.use_l10n
88
89 def __str__(self):
90 kwargs = {"money": self, "decimal_places": self.decimal_places_display}
91 if self.is_localized:
92 locale = get_current_locale()
93 if locale:
94 kwargs["locale"] = locale
95
96 return format_money(**kwargs)
97
98 def __html__(self):
99 return mark_safe(avoid_wrapping(conditional_escape(str(self))))
100
101 def __round__(self, n=None):
102 amount = round(self.amount, n)
103 return self.__class__(amount, self.currency)
104
105 # DefaultMoney sets those synonym functions
106 # we overwrite the 'targets' so the wrong synonyms are called
107 # Example: we overwrite __add__; __radd__ calls __add__ on DefaultMoney...
108 __radd__ = __add__
109 __rsub__ = __sub__
110 __rmul__ = __mul__
111
112
113 def get_current_locale():
114 # get_language can return None starting from Django 1.8
115 language = translation.get_language() or settings.LANGUAGE_CODE
116 locale = translation.to_locale(language)
117
118 if locale.upper() in _FORMATTER.formatting_definitions:
119 return locale
120
121 locale = ("%s_%s" % (locale, locale)).upper()
122 if locale in _FORMATTER.formatting_definitions:
123 return locale
124
125 return ""
126
127
128 def maybe_convert(value, currency):
129 """
130 Converts other Money instances to the local currency if `AUTO_CONVERT_MONEY` is set to True.
131 """
132 if getattr(settings, "AUTO_CONVERT_MONEY", False) and value.currency != currency:
133 from .contrib.exchange.models import convert_money
134
135 return convert_money(value, currency)
136 return value
137
[end of djmoney/money.py]
[start of docs/changes.rst]
1 Changelog
2 =========
3
4 `Unreleased`_ - TBD
5 -------------------
6
7 **Added**
8
9 - Improved localization: New setting ``CURRENCY_DECIMAL_PLACES_DISPLAY`` configures decimal places to display for each configured currency `#521`_ (`wearebasti`_)
10
11 **Changed**
12
13 - Set the default value for ``models.fields.MoneyField`` to ``NOT_PROVIDED``.
14
15 **Fixed**
16
17 - Pin ``pymoneyed<1.0`` as it changed the ``repr`` output of the ``Money`` class.
18
19 `1.2.2`_ - 2020-12-29
20 ---------------------
21
22 **Fixed**
23
24 - Confusing "number-over-money" division behavior by backporting changes from ``py-moneyed``. `#586`_ (`wearebasti`_)
25 - ``AttributeError`` when a ``Money`` instance is divided by ``Money``. `#585`_ (`niklasb`_)
26
27 `1.2.1`_ - 2020-11-29
28 ---------------------
29
30 **Fixed**
31
32 - Aggregation through a proxy model. `#583`_ (`tned73`_)
33
34 `1.2`_ - 2020-11-26
35 -------------------
36
37 **Fixed**
38
39 - Resulting Money object from arithmetics (add / sub / ...) inherits maximum decimal_places from arguments `#522`_ (`wearebasti`_)
40 - ``DeprecationWarning`` related to the usage of ``cafile`` in ``urlopen``. `#553`_ (`Stranger6667`_)
41
42 **Added**
43
44 - Django 3.1 support
45
46 `1.1`_ - 2020-04-06
47 -------------------
48
49 **Fixed**
50
51 - Optimize money operations on MoneyField instances with the same currencies. `#541`_ (`horpto`_)
52
53 **Added**
54
55 - Support for ``Money`` type in ``QuerySet.bulk_update()`` `#534`_ (`satels`_)
56
57 `1.0`_ - 2019-11-08
58 -------------------
59
60 **Added**
61
62 - Support for money descriptor customization. (`Stranger6667`_)
63 - Fix ``order_by()`` not returning money-compatible queryset `#519`_ (`lieryan`_)
64 - Django 3.0 support
65
66 **Removed**
67
68 - Support for Django 1.8 & 2.0. (`Stranger6667`_)
69 - Support for Python 2.7. `#515`_ (`benjaoming`_)
70 - Support for Python 3.4. (`Stranger6667`_)
71 - ``MoneyPatched``, use ``djmoney.money.Money`` instead. (`Stranger6667`_)
72
73 **Fixed**
74
75 - Support instances with ``decimal_places=0`` `#509`_ (`fara`_)
76
77 `0.15.1`_ - 2019-06-22
78 ----------------------
79
80 **Fixed**
81
82 - Respect field ``decimal_places`` when instantiating ``Money`` object from field db values. `#501`_ (`astutejoe`_)
83 - Restored linting in CI tests (`benjaoming`_)
84
85 `0.15`_ - 2019-05-30
86 --------------------
87
88 .. warning:: This release contains backwards incompatibility, please read the release notes below.
89
90 Backwards incompatible changes
91 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
92
93 - Remove implicit default value on non-nullable MoneyFields.
94 Backwards incompatible change: set explicit ``default=0.0`` to keep previous behavior. `#411`_ (`washeck`_)
95 - Remove support for calling ``float`` on ``Money`` instances. Use the ``amount`` attribute instead. (`Stranger6667`_)
96 - ``MinMoneyValidator`` and ``MaxMoneyValidator`` are not inherited from Django's ``MinValueValidator`` and ``MaxValueValidator`` anymore. `#376`_
97 - In model and non-model forms ``forms.MoneyField`` uses ``CURRENCY_DECIMAL_PLACES`` as the default value for ``decimal_places``. `#434`_ (`Stranger6667`_, `andytwoods`_)
98
99 **Added**
100
101 - Add ``Money.decimal_places`` for per-instance configuration of decimal places in the string representation.
102 - Support for customization of ``CurrencyField`` length. Some cryptocurrencies could have codes longer than three characters. `#480`_ (`Stranger6667`_, `MrFus10n`_)
103 - Add ``default_currency`` option for REST Framework field. `#475`_ (`butorov`_)
104
105 **Fixed**
106
107 - Failing certificates checks when accessing 3rd party exchange rates backends.
108 Fixed by adding `certifi` to the dependencies list. `#403`_ (`Stranger6667`_)
109 - Fixed model-level ``validators`` behavior in REST Framework. `#376`_ (`rapIsKal`_, `Stranger6667`_)
110 - Setting keyword argument ``default_currency=None`` for ``MoneyField`` did not revert to ``settings.DEFAULT_CURRENCY`` and set ``str(None)`` as database value for currency. `#490`_ (`benjaoming`_)
111
112 **Changed**
113
114 - Allow using patched ``django.core.serializers.python._get_model`` in serializers, which could be helpful for
115 migrations. (`Formulka`_, `Stranger6667`_)
116
117 `0.14.4`_ - 2019-01-07
118 ----------------------
119
120 **Changed**
121
122 - Re-raise arbitrary exceptions in JSON deserializer as `DeserializationError`. (`Stranger6667`_)
123
124 **Fixed**
125
126 - Invalid Django 1.8 version check in ``djmoney.models.fields.MoneyField.value_to_string``. (`Stranger6667`_)
127 - InvalidOperation in ``djmoney.contrib.django_rest_framework.fields.MoneyField.get_value`` when amount is None and currency is not None. `#458`_ (`carvincarl`_)
128
129 `0.14.3`_ - 2018-08-14
130 ----------------------
131
132 **Fixed**
133
134 - ``djmoney.forms.widgets.MoneyWidget`` decompression on Django 2.1+. `#443`_ (`Stranger6667`_)
135
136 `0.14.2`_ - 2018-07-23
137 ----------------------
138
139 **Fixed**
140
141 - Validation of ``djmoney.forms.fields.MoneyField`` when ``disabled=True`` is passed to it. `#439`_ (`stinovlas`_, `Stranger6667`_)
142
143 `0.14.1`_ - 2018-07-17
144 ----------------------
145
146 **Added**
147
148 - Support for indirect rates conversion through maximum 1 extra step (when there is no direct conversion rate:
149 converting by means of a third currency for which both source and target currency have conversion
150 rates). `#425`_ (`Stranger6667`_, `77cc33`_)
151
152 **Fixed**
153
154 - Error was raised when trying to do a query with a `ModelWithNullableCurrency`. `#427`_ (`Woile`_)
155
156 `0.14`_ - 2018-06-09
157 --------------------
158
159 **Added**
160
161 - Caching of exchange rates. `#398`_ (`Stranger6667`_)
162 - Added support for nullable ``CurrencyField``. `#260`_ (`Stranger6667`_)
163
164 **Fixed**
165
166 - Same currency conversion getting MissingRate exception `#418`_ (`humrochagf`_)
167 - `TypeError` during templatetag usage inside a for loop on Django 2.0. `#402`_ (`f213`_)
168
169 **Removed**
170
171 - Support for Python 3.3 `#410`_ (`benjaoming`_)
172 - Deprecated ``choices`` argument from ``djmoney.forms.fields.MoneyField``. Use ``currency_choices`` instead. (`Stranger6667`_)
173
174 `0.13.5`_ - 2018-05-19
175 ----------------------
176
177 **Fixed**
178
179 - Missing in dist, ``djmoney/__init__.py``. `#417`_ (`benjaoming`_)
180
181 `0.13.4`_ - 2018-05-19
182 ----------------------
183
184 **Fixed**
185
186 - Packaging of ``djmoney.contrib.exchange.management.commands``. `#412`_ (`77cc33`_, `Stranger6667`_)
187
188 `0.13.3`_ - 2018-05-12
189 ----------------------
190
191 **Added**
192
193 - Rounding support via ``round`` built-in function on Python 3. (`Stranger6667`_)
194
195 `0.13.2`_ - 2018-04-16
196 ----------------------
197
198 **Added**
199
200 - Django Admin integration for exchange rates. `#392`_ (`Stranger6667`_)
201
202 **Fixed**
203
204 - Exchange rates. TypeError when decoding JSON on Python 3.3-3.5. `#399`_ (`kcyeu`_)
205 - Managers patching for models with custom ``Meta.default_manager_name``. `#400`_ (`Stranger6667`_)
206
207 `0.13.1`_ - 2018-04-07
208 ----------------------
209
210 **Fixed**
211
212 - Regression: Could not run w/o ``django.contrib.exchange`` `#388`_ (`Stranger6667`_)
213
214 `0.13`_ - 2018-04-07
215 --------------------
216
217 **Added**
218
219 - Currency exchange `#385`_ (`Stranger6667`_)
220
221 **Removed**
222
223 - Support for ``django-money-rates`` `#385`_ (`Stranger6667`_)
224 - Deprecated ``Money.__float__`` which is implicitly called on some ``sum()`` operations `#347`_. (`jonashaag`_)
225
226 Migration from django-money-rates
227 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
228
229 The new application is a drop-in replacement for ``django-money-rates``.
230 To migrate from ``django-money-rates``:
231
232 - In ``INSTALLED_APPS`` replace ``djmoney_rates`` with ``djmoney.contrib.exchange``
233 - Set ``OPEN_EXCHANGE_RATES_APP_ID`` setting with your app id
234 - Run ``python manage.py migrate``
235 - Run ``python manage.py update_rates``
236
237 For more information, look at ``Working with Exchange Rates`` section in README.
238
239 `0.12.3`_ - 2017-12-13
240 ----------------------
241
242 **Fixed**
243
244 - Fixed ``BaseMoneyValidator`` with falsy limit values. `#371`_ (`1337`_)
245
246 `0.12.2`_ - 2017-12-12
247 ----------------------
248
249 **Fixed**
250
251 - Django master branch compatibility. `#361`_ (`Stranger6667`_)
252 - Fixed ``get_or_create`` for models with shared currency. `#364`_ (`Stranger6667`_)
253
254 **Changed**
255
256 - Removed confusing rounding to integral value in ``Money.__repr__``. `#366`_ (`Stranger6667`_, `evenicoulddoit`_)
257
258 `0.12.1`_ - 2017-11-20
259 ----------------------
260
261 **Fixed**
262
263 - Fixed migrations on SQLite. `#139`_, `#338`_ (`Stranger6667`_)
264 - Fixed ``Field.rel.to`` usage for Django 2.0. `#349`_ (`richardowen`_)
265 - Fixed Django REST Framework behaviour for serializers without ``*_currency`` field in serializer's ``Meta.fields``. `#351`_ (`elcolie`_, `Stranger6667`_)
266
267 `0.12`_ - 2017-10-22
268 --------------------
269
270 **Added**
271
272 - Ability to specify name for currency field. `#195`_ (`Stranger6667`_)
273 - Validators for ``MoneyField``. `#308`_ (`Stranger6667`_)
274
275 **Changed**
276
277 - Improved ``Money`` support. Now ``django-money`` fully relies on ``pymoneyed`` localization everywhere, including Django admin. `#276`_ (`Stranger6667`_)
278 - Implement ``__html__`` method. If used in Django templates, an ``Money`` object's amount and currency are now separated with non-breaking space (`` ``) `#337`_ (`jonashaag`_)
279
280 **Deprecated**
281
282 - ``djmoney.models.fields.MoneyPatched`` and ``moneyed.Money`` are deprecated. Use ``djmoney.money.Money`` instead.
283
284 **Fixed**
285
286 - Fixed model field validation. `#308`_ (`Stranger6667`_).
287 - Fixed managers caching for Django >= 1.10. `#318`_ (`Stranger6667`_).
288 - Fixed ``F`` expressions support for ``in`` lookups. `#321`_ (`Stranger6667`_).
289 - Fixed money comprehension on querysets. `#331`_ (`Stranger6667`_, `jaavii1988`_).
290 - Fixed errors in Django Admin integration. `#334`_ (`Stranger6667`_, `adi-`_).
291
292 **Removed**
293
294 - Dropped support for Python 2.6 and 3.2. (`Stranger6667`_)
295 - Dropped support for Django 1.4, 1.5, 1.6, 1.7 and 1.9. (`Stranger6667`_)
296
297 `0.11.4`_ - 2017-06-26
298 ----------------------
299
300 **Fixed**
301
302 - Fixed money parameters processing in update queries. `#309`_ (`Stranger6667`_)
303
304 `0.11.3`_ - 2017-06-19
305 ----------------------
306
307 **Fixed**
308
309 - Restored support for Django 1.4, 1.5, 1.6, and 1.7 & Python 2.6 `#304`_ (`Stranger6667`_)
310
311 `0.11.2`_ - 2017-05-31
312 ----------------------
313
314 **Fixed**
315
316 - Fixed field lookup regression. `#300`_ (`lmdsp`_, `Stranger6667`_)
317
318 `0.11.1`_ - 2017-05-26
319 ----------------------
320
321 **Fixed**
322
323 - Fixed access to models properties. `#297`_ (`mithrilstar`_, `Stranger6667`_)
324
325 **Removed**
326
327 - Dropped support for Python 2.6. (`Stranger6667`_)
328 - Dropped support for Django < 1.8. (`Stranger6667`_)
329
330 `0.11`_ - 2017-05-19
331 --------------------
332
333 **Added**
334
335 - An ability to set custom currency choices via ``CURRENCY_CHOICES`` settings option. `#211`_ (`Stranger6667`_, `ChessSpider`_)
336
337 **Fixed**
338
339 - Fixed ``AttributeError`` in ``get_or_create`` when the model have no default. `#268`_ (`Stranger6667`_, `lobziik`_)
340 - Fixed ``UnicodeEncodeError`` in string representation of ``MoneyPatched`` on Python 2. `#272`_ (`Stranger6667`_)
341 - Fixed various displaying errors in Django Admin . `#232`_, `#220`_, `#196`_, `#102`_, `#90`_ (`Stranger6667`_,
342 `arthurk`_, `mstarostik`_, `eriktelepovsky`_, `jplehmann`_, `graik`_, `benjaoming`_, `k8n`_, `yellow-sky`_)
343 - Fixed non-Money values support for ``in`` lookup. `#278`_ (`Stranger6667`_)
344 - Fixed available lookups with removing of needless lookup check. `#277`_ (`Stranger6667`_)
345 - Fixed compatibility with ``py-moneyed``. (`Stranger6667`_)
346 - Fixed ignored currency value in Django REST Framework integration. `#292`_ (`gonzalobf`_)
347
348 `0.10.2`_ - 2017-02-18
349 ----------------------
350
351 **Added**
352
353 - Added ability to configure decimal places output. `#154`_, `#251`_ (`ivanchenkodmitry`_)
354
355 **Fixed**
356
357 - Fixed handling of ``defaults`` keyword argument in ``get_or_create`` method. `#257`_ (`kjagiello`_)
358 - Fixed handling of currency fields lookups in ``get_or_create`` method. `#258`_ (`Stranger6667`_)
359 - Fixed ``PendingDeprecationWarning`` during form initialization. `#262`_ (`Stranger6667`_, `spookylukey`_)
360 - Fixed handling of ``F`` expressions which involve non-Money fields. `#265`_ (`Stranger6667`_)
361
362 `0.10.1`_ - 2016-12-26
363 ----------------------
364
365 **Fixed**
366
367 - Fixed default value for ``djmoney.forms.fields.MoneyField``. `#249`_ (`tsouvarev`_)
368
369 `0.10`_ - 2016-12-19
370 --------------------
371
372 **Changed**
373
374 - Do not fail comparisons because of different currency. Just return ``False`` `#225`_ (`benjaoming`_ and `ivirabyan`_)
375
376 **Fixed**
377
378 - Fixed ``understands_money`` behaviour. Now it can be used as a decorator `#215`_ (`Stranger6667`_)
379 - Fixed: Not possible to revert MoneyField currency back to default `#221`_ (`benjaoming`_)
380 - Fixed invalid ``creation_counter`` handling. `#235`_ (`msgre`_ and `Stranger6667`_)
381 - Fixed broken field resolving. `#241`_ (`Stranger6667`_)
382
383 `0.9.1`_ - 2016-08-01
384 ---------------------
385
386 **Fixed**
387
388 - Fixed packaging.
389
390 `0.9.0`_ - 2016-07-31
391 ---------------------
392
393 NB! If you are using custom model managers **not** named ``objects`` and you expect them to still work, please read below.
394
395 **Added**
396
397 - Support for ``Value`` and ``Func`` expressions in queries. (`Stranger6667`_)
398 - Support for ``in`` lookup. (`Stranger6667`_)
399 - Django REST Framework support. `#179`_ (`Stranger6667`_)
400 - Django 1.10 support. `#198`_ (`Stranger6667`_)
401 - Improved South support. (`Stranger6667`_)
402
403 **Changed**
404
405 - Changed auto conversion of currencies using djmoney_rates (added in 0.7.3) to
406 be off by default. You must now add ``AUTO_CONVERT_MONEY = True`` in
407 your ``settings.py`` if you want this feature. `#199`_ (`spookylukey`_)
408 - Only make ``objects`` a MoneyManager instance automatically. `#194`_ and `#201`_ (`inureyes`_)
409
410 **Fixed**
411
412 - Fixed default currency value for nullable fields in forms. `#138`_ (`Stranger6667`_)
413 - Fixed ``_has_changed`` deprecation warnings. `#206`_ (`Stranger6667`_)
414 - Fixed ``get_or_create`` crash, when ``defaults`` is passed. `#213`_ (`Stranger6667`_, `spookylukey`_)
415
416 Note about automatic model manager patches
417 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
418
419 In 0.8, Django-money automatically patches every model managers with
420 ``MoneyManager``. This causes migration problems if two or more managers are
421 used in the same model.
422
423 As a side effect, other managers are also finally wrapped with ``MoneyManager``.
424 This effect leads Django migration to point to fields with other managers to
425 ``MoneyManager``, and raises ``ValueError`` (``MoneyManager`` only exists as a
426 return of ``money_manager``, not a class-form. However migration procedure tries
427 to find ``MoneyManager`` to patch other managers.)
428
429 From 0.9, Django-money only patches ``objects`` with ``MoneyManager`` by default
430 (as documented). To patch other managers (e.g. custom managers), patch them by
431 wrapping with ``money_manager``.
432
433 .. code-block:: python
434
435 from djmoney.models.managers import money_manager
436
437
438 class BankAccount(models.Model):
439 balance = MoneyField(max_digits=10, decimal_places=2, default_currency='USD')
440 accounts = money_manager(MyCustomManager())
441
442 `0.8`_ - 2016-04-23
443 -------------------
444
445 **Added**
446
447 - Support for serialization of ``MoneyPatched`` instances in migrations. (`AlexRiina`_)
448 - Improved django-money-rates support. `#173`_ (`Stranger6667`_)
449 - Extended ``F`` expressions support. (`Stranger6667`_)
450 - Pre-commit hooks support. (`benjaoming`_)
451 - Isort integration. (`Stranger6667`_)
452 - Makefile for common commands. (`Stranger6667`_)
453 - Codecov.io integration. (`Stranger6667`_)
454 - Python 3.5 builds to tox.ini and travis.yml. (`Stranger6667`_)
455 - Django master support. (`Stranger6667`_)
456 - Python 3.2 compatibility. (`Stranger6667`_)
457
458 **Changed**
459
460 - Refactored test suite (`Stranger6667`_)
461
462 **Fixed**
463
464 - Fixed fields caching. `#186`_ (`Stranger6667`_)
465 - Fixed m2m fields data loss on Django < 1.8. `#184`_ (`Stranger6667`_)
466 - Fixed managers access via instances. `#86`_ (`Stranger6667`_)
467 - Fixed currency handling behaviour. `#172`_ (`Stranger6667`_)
468 - Many PEP8 & flake8 fixes. (`benjaoming`_)
469 - Fixed filtration with ``F`` expressions. `#174`_ (`Stranger6667`_)
470 - Fixed querying on Django 1.8+. `#166`_ (`Stranger6667`_)
471
472 `0.7.6`_ - 2016-01-08
473 ---------------------
474
475 **Added**
476
477 - Added correct paths for py.test discovery. (`benjaoming`_)
478 - Mention Django 1.9 in tox.ini. (`benjaoming`_)
479
480 **Fixed**
481
482 - Fix for ``get_or_create`` / ``create`` manager methods not respecting currency code. (`toudi`_)
483 - Fix unit tests. (`toudi`_)
484 - Fix for using ``MoneyField`` with ``F`` expressions when using Django >= 1.8. (`toudi`_)
485
486 `0.7.5`_ - 2015-12-22
487 ---------------------
488
489 **Fixed**
490
491 - Fallback to ``_meta.fields`` if ``_meta.get_fields`` raises ``AttributeError`` `#149`_ (`browniebroke`_)
492 - pip instructions updated. (`GheloAce`_)
493
494 `0.7.4`_ - 2015-11-02
495 ---------------------
496
497 **Added**
498
499 - Support for Django 1.9 (`kjagiello`_)
500
501 **Fixed**
502
503 - Fixed loaddata. (`jack-cvr`_)
504 - Python 2.6 fixes. (`jack-cvr`_)
505 - Fixed currency choices ordering. (`synotna`_)
506
507 `0.7.3`_ - 2015-10-16
508 ---------------------
509
510 **Added**
511
512 - Sum different currencies. (`dnmellen`_)
513 - ``__eq__`` method. (`benjaoming`_)
514 - Comparison of different currencies. (`benjaoming`_)
515 - Default currency. (`benjaoming`_)
516
517 **Fixed**
518
519 - Fix using Choices for setting currency choices. (`benjaoming`_)
520 - Fix tests for Python 2.6. (`plumdog`_)
521
522 `0.7.2`_ - 2015-09-01
523 ---------------------
524
525 **Fixed**
526
527 - Better checks on ``None`` values. (`tsouvarev`_, `sjdines`_)
528 - Consistency with South declarations and calling ``str`` function. (`sjdines`_)
529
530 `0.7.1`_ - 2015-08-11
531 ---------------------
532
533 **Fixed**
534
535 - Fix bug in printing ``MoneyField``. (`YAmikep`_)
536 - Added fallback value for current locale getter. (`sjdines`_)
537
538 `0.7.0`_ - 2015-06-14
539 ---------------------
540
541 **Added**
542
543 - Django 1.8 compatibility. (`willhcr`_)
544
545 `0.6.0`_ - 2015-05-23
546 ---------------------
547
548 **Added**
549
550 - Python 3 trove classifier. (`dekkers`_)
551
552 **Changed**
553
554 - Tox cleanup. (`edwinlunando`_)
555 - Improved ``README``. (`glarrain`_)
556 - Added/Cleaned up tests. (`spookylukey`_, `AlexRiina`_)
557
558 **Fixed**
559
560 - Append ``_currency`` to non-money ExpressionFields. `#101`_ (`alexhayes`_, `AlexRiina`_, `briankung`_)
561 - Data truncated for column. `#103`_ (`alexhayes`_)
562 - Fixed ``has_changed`` not working. `#95`_ (`spookylukey`_)
563 - Fixed proxy model with ``MoneyField`` returns wrong class. `#80`_ (`spookylukey`_)
564
565 `0.5.0`_ - 2014-12-15
566 ---------------------
567
568 **Added**
569
570 - Django 1.7 compatibility. (`w00kie`_)
571
572 **Fixed**
573
574 - Added ``choices=`` to instantiation of currency widget. (`davidstockwell`_)
575 - Nullable ``MoneyField`` should act as ``default=None``. (`jakewins`_)
576 - Fixed bug where a non-required ``MoneyField`` threw an exception. (`spookylukey`_)
577
578 `0.4.2`_ - 2014-07-31
579 ---------------------
580 `0.4.1`_ - 2013-11-28
581 ---------------------
582 `0.4.0.0`_ - 2013-11-26
583 -----------------------
584
585 **Added**
586
587 - Python 3 compatibility.
588 - tox tests.
589 - Format localization.
590 - Template tag ``money_localize``.
591
592 `0.3.4`_ - 2013-11-25
593 ---------------------
594 `0.3.3.2`_ - 2013-10-31
595 -----------------------
596 `0.3.3.1`_ - 2013-10-01
597 -----------------------
598 `0.3.3`_ - 2013-02-17
599 ---------------------
600
601 **Added**
602
603 - South support via implementing the ``south_triple_field`` method. (`mattions`_)
604
605 **Fixed**
606
607 - Fixed issues with money widget not passing attrs up to django's render method, caused id attribute to not be set in html for widgets. (`adambregenzer`_)
608 - Fixed issue of default currency not being passed on to widget. (`snbuchholz`_)
609 - Return the right default for South. (`mattions`_)
610 - Django 1.5 compatibility. (`devlocal`_)
611
612 `0.3.2`_ - 2012-11-30
613 ---------------------
614
615 **Fixed**
616
617 - Fixed issues with ``display_for_field`` not detecting fields correctly. (`adambregenzer`_)
618 - Added South ignore rule to avoid duplicate currency field when using the frozen ORM. (`rach`_)
619 - Disallow override of objects manager if not setting it up with an instance. (`rach`_)
620
621 `0.3.1`_ - 2012-10-11
622 ---------------------
623
624 **Fixed**
625
626 - Fix ``AttributeError`` when Model inherit a manager. (`rach`_)
627 - Correctly serialize the field. (`akumria`_)
628
629 `0.3`_ - 2012-09-30
630 -------------------
631
632 **Added**
633
634 - Allow django-money to be specified as read-only in a model. (`akumria`_)
635 - South support: Declare default attribute values. (`pjdelport`_)
636
637 `0.2`_ - 2012-04-10
638 -------------------
639
640 - Initial public release
641
642 .. _Unreleased: https:///github.com/django-money/django-money/compare/1.2.2...HEAD
643 .. _1.2.2: https://github.com/django-money/django-money/compare/1.2.1...1.2.2
644 .. _1.2.1: https://github.com/django-money/django-money/compare/1.2...1.2.1
645 .. _1.2: https://github.com/django-money/django-money/compare/1.1...1.2
646 .. _1.1: https://github.com/django-money/django-money/compare/1.0...1.1
647 .. _1.0: https://github.com/django-money/django-money/compare/0.15.1...1.0
648 .. _0.15.1: https://github.com/django-money/django-money/compare/0.15.1...0.15
649 .. _0.15: https://github.com/django-money/django-money/compare/0.15...0.14.4
650 .. _0.14.4: https://github.com/django-money/django-money/compare/0.14.4...0.14.3
651 .. _0.14.3: https://github.com/django-money/django-money/compare/0.14.3...0.14.2
652 .. _0.14.2: https://github.com/django-money/django-money/compare/0.14.2...0.14.1
653 .. _0.14.1: https://github.com/django-money/django-money/compare/0.14.1...0.14
654 .. _0.14: https://github.com/django-money/django-money/compare/0.14...0.13.5
655 .. _0.13.5: https://github.com/django-money/django-money/compare/0.13.4...0.13.5
656 .. _0.13.4: https://github.com/django-money/django-money/compare/0.13.3...0.13.4
657 .. _0.13.3: https://github.com/django-money/django-money/compare/0.13.2...0.13.3
658 .. _0.13.2: https://github.com/django-money/django-money/compare/0.13.1...0.13.2
659 .. _0.13.1: https://github.com/django-money/django-money/compare/0.13...0.13.1
660 .. _0.13: https://github.com/django-money/django-money/compare/0.12.3...0.13
661 .. _0.12.3: https://github.com/django-money/django-money/compare/0.12.2...0.12.3
662 .. _0.12.2: https://github.com/django-money/django-money/compare/0.12.1...0.12.2
663 .. _0.12.1: https://github.com/django-money/django-money/compare/0.12...0.12.1
664 .. _0.12: https://github.com/django-money/django-money/compare/0.11.4...0.12
665 .. _0.11.4: https://github.com/django-money/django-money/compare/0.11.3...0.11.4
666 .. _0.11.3: https://github.com/django-money/django-money/compare/0.11.2...0.11.3
667 .. _0.11.2: https://github.com/django-money/django-money/compare/0.11.1...0.11.2
668 .. _0.11.1: https://github.com/django-money/django-money/compare/0.11...0.11.1
669 .. _0.11: https://github.com/django-money/django-money/compare/0.10.2...0.11
670 .. _0.10.2: https://github.com/django-money/django-money/compare/0.10.1...0.10.2
671 .. _0.10.1: https://github.com/django-money/django-money/compare/0.10...0.10.1
672 .. _0.10: https://github.com/django-money/django-money/compare/0.9.1...0.10
673 .. _0.9.1: https://github.com/django-money/django-money/compare/0.9.0...0.9.1
674 .. _0.9.0: https://github.com/django-money/django-money/compare/0.8...0.9.0
675 .. _0.8: https://github.com/django-money/django-money/compare/0.7.6...0.8
676 .. _0.7.6: https://github.com/django-money/django-money/compare/0.7.5...0.7.6
677 .. _0.7.5: https://github.com/django-money/django-money/compare/0.7.4...0.7.5
678 .. _0.7.4: https://github.com/django-money/django-money/compare/0.7.3...0.7.4
679 .. _0.7.3: https://github.com/django-money/django-money/compare/0.7.2...0.7.3
680 .. _0.7.2: https://github.com/django-money/django-money/compare/0.7.1...0.7.2
681 .. _0.7.1: https://github.com/django-money/django-money/compare/0.7.0...0.7.1
682 .. _0.7.0: https://github.com/django-money/django-money/compare/0.6.0...0.7.0
683 .. _0.6.0: https://github.com/django-money/django-money/compare/0.5.0...0.6.0
684 .. _0.5.0: https://github.com/django-money/django-money/compare/0.4.2...0.5.0
685 .. _0.4.2: https://github.com/django-money/django-money/compare/0.4.1...0.4.2
686 .. _0.4.1: https://github.com/django-money/django-money/compare/0.4.0.0...0.4.1
687 .. _0.4.0.0: https://github.com/django-money/django-money/compare/0.3.4...0.4.0.0
688 .. _0.3.4: https://github.com/django-money/django-money/compare/0.3.3.2...0.3.4
689 .. _0.3.3.2: https://github.com/django-money/django-money/compare/0.3.3.1...0.3.3.2
690 .. _0.3.3.1: https://github.com/django-money/django-money/compare/0.3.3...0.3.3.1
691 .. _0.3.3: https://github.com/django-money/django-money/compare/0.3.2...0.3.3
692 .. _0.3.2: https://github.com/django-money/django-money/compare/0.3.1...0.3.2
693 .. _0.3.1: https://github.com/django-money/django-money/compare/0.3...0.3.1
694 .. _0.3: https://github.com/django-money/django-money/compare/0.2...0.3
695 .. _0.2: https://github.com/django-money/django-money/compare/0.2...a6d90348085332a393abb40b86b5dd9505489b04
696
697 .. _#586: https://github.com/django-money/django-money/issues/586
698 .. _#585: https://github.com/django-money/django-money/pull/585
699 .. _#583: https://github.com/django-money/django-money/issues/583
700 .. _#553: https://github.com/django-money/django-money/issues/553
701 .. _#541: https://github.com/django-money/django-money/issues/541
702 .. _#534: https://github.com/django-money/django-money/issues/534
703 .. _#515: https://github.com/django-money/django-money/issues/515
704 .. _#509: https://github.com/django-money/django-money/issues/509
705 .. _#501: https://github.com/django-money/django-money/issues/501
706 .. _#490: https://github.com/django-money/django-money/issues/490
707 .. _#475: https://github.com/django-money/django-money/issues/475
708 .. _#480: https://github.com/django-money/django-money/issues/480
709 .. _#458: https://github.com/django-money/django-money/issues/458
710 .. _#443: https://github.com/django-money/django-money/issues/443
711 .. _#439: https://github.com/django-money/django-money/issues/439
712 .. _#434: https://github.com/django-money/django-money/issues/434
713 .. _#427: https://github.com/django-money/django-money/pull/427
714 .. _#425: https://github.com/django-money/django-money/issues/425
715 .. _#417: https://github.com/django-money/django-money/issues/417
716 .. _#412: https://github.com/django-money/django-money/issues/412
717 .. _#410: https://github.com/django-money/django-money/issues/410
718 .. _#403: https://github.com/django-money/django-money/issues/403
719 .. _#402: https://github.com/django-money/django-money/issues/402
720 .. _#400: https://github.com/django-money/django-money/issues/400
721 .. _#399: https://github.com/django-money/django-money/issues/399
722 .. _#398: https://github.com/django-money/django-money/issues/398
723 .. _#392: https://github.com/django-money/django-money/issues/392
724 .. _#388: https://github.com/django-money/django-money/issues/388
725 .. _#385: https://github.com/django-money/django-money/issues/385
726 .. _#376: https://github.com/django-money/django-money/issues/376
727 .. _#347: https://github.com/django-money/django-money/issues/347
728 .. _#371: https://github.com/django-money/django-money/issues/371
729 .. _#366: https://github.com/django-money/django-money/issues/366
730 .. _#364: https://github.com/django-money/django-money/issues/364
731 .. _#361: https://github.com/django-money/django-money/issues/361
732 .. _#351: https://github.com/django-money/django-money/issues/351
733 .. _#349: https://github.com/django-money/django-money/pull/349
734 .. _#338: https://github.com/django-money/django-money/issues/338
735 .. _#337: https://github.com/django-money/django-money/issues/337
736 .. _#334: https://github.com/django-money/django-money/issues/334
737 .. _#331: https://github.com/django-money/django-money/issues/331
738 .. _#321: https://github.com/django-money/django-money/issues/321
739 .. _#318: https://github.com/django-money/django-money/issues/318
740 .. _#309: https://github.com/django-money/django-money/issues/309
741 .. _#308: https://github.com/django-money/django-money/issues/308
742 .. _#304: https://github.com/django-money/django-money/issues/304
743 .. _#300: https://github.com/django-money/django-money/issues/300
744 .. _#297: https://github.com/django-money/django-money/issues/297
745 .. _#292: https://github.com/django-money/django-money/issues/292
746 .. _#278: https://github.com/django-money/django-money/issues/278
747 .. _#277: https://github.com/django-money/django-money/issues/277
748 .. _#276: https://github.com/django-money/django-money/issues/276
749 .. _#272: https://github.com/django-money/django-money/issues/272
750 .. _#268: https://github.com/django-money/django-money/issues/268
751 .. _#265: https://github.com/django-money/django-money/issues/265
752 .. _#262: https://github.com/django-money/django-money/issues/262
753 .. _#260: https://github.com/django-money/django-money/issues/260
754 .. _#258: https://github.com/django-money/django-money/issues/258
755 .. _#257: https://github.com/django-money/django-money/pull/257
756 .. _#251: https://github.com/django-money/django-money/pull/251
757 .. _#249: https://github.com/django-money/django-money/pull/249
758 .. _#241: https://github.com/django-money/django-money/issues/241
759 .. _#235: https://github.com/django-money/django-money/issues/235
760 .. _#232: https://github.com/django-money/django-money/issues/232
761 .. _#225: https://github.com/django-money/django-money/issues/225
762 .. _#221: https://github.com/django-money/django-money/issues/221
763 .. _#220: https://github.com/django-money/django-money/issues/220
764 .. _#215: https://github.com/django-money/django-money/issues/215
765 .. _#213: https://github.com/django-money/django-money/issues/213
766 .. _#211: https://github.com/django-money/django-money/issues/211
767 .. _#206: https://github.com/django-money/django-money/issues/206
768 .. _#201: https://github.com/django-money/django-money/issues/201
769 .. _#199: https://github.com/django-money/django-money/issues/199
770 .. _#198: https://github.com/django-money/django-money/issues/198
771 .. _#196: https://github.com/django-money/django-money/issues/196
772 .. _#195: https://github.com/django-money/django-money/issues/195
773 .. _#194: https://github.com/django-money/django-money/issues/194
774 .. _#186: https://github.com/django-money/django-money/issues/186
775 .. _#184: https://github.com/django-money/django-money/issues/184
776 .. _#179: https://github.com/django-money/django-money/issues/179
777 .. _#174: https://github.com/django-money/django-money/issues/174
778 .. _#173: https://github.com/django-money/django-money/issues/173
779 .. _#172: https://github.com/django-money/django-money/issues/172
780 .. _#166: https://github.com/django-money/django-money/issues/166
781 .. _#154: https://github.com/django-money/django-money/issues/154
782 .. _#149: https://github.com/django-money/django-money/issues/149
783 .. _#139: https://github.com/django-money/django-money/issues/139
784 .. _#138: https://github.com/django-money/django-money/issues/138
785 .. _#103: https://github.com/django-money/django-money/issues/103
786 .. _#102: https://github.com/django-money/django-money/issues/102
787 .. _#101: https://github.com/django-money/django-money/issues/101
788 .. _#95: https://github.com/django-money/django-money/issues/95
789 .. _#90: https://github.com/django-money/django-money/issues/90
790 .. _#86: https://github.com/django-money/django-money/issues/86
791 .. _#80: https://github.com/django-money/django-money/issues/80
792 .. _#418: https://github.com/django-money/django-money/issues/418
793 .. _#411: https://github.com/django-money/django-money/issues/411
794 .. _#519: https://github.com/django-money/django-money/issues/519
795 .. _#521: https://github.com/django-money/django-money/issues/521
796 .. _#522: https://github.com/django-money/django-money/issues/522
797
798
799 .. _77cc33: https://github.com/77cc33
800 .. _AlexRiina: https://github.com/AlexRiina
801 .. _carvincarl: https://github.com/carvincarl
802 .. _ChessSpider: https://github.com/ChessSpider
803 .. _GheloAce: https://github.com/GheloAce
804 .. _Stranger6667: https://github.com/Stranger6667
805 .. _YAmikep: https://github.com/YAmikep
806 .. _adambregenzer: https://github.com/adambregenzer
807 .. _adi-: https://github.com/adi-
808 .. _akumria: https://github.com/akumria
809 .. _alexhayes: https://github.com/alexhayes
810 .. _andytwoods: https://github.com/andytwoods
811 .. _arthurk: https://github.com/arthurk
812 .. _astutejoe: https://github.com/astutejoe
813 .. _benjaoming: https://github.com/benjaoming
814 .. _briankung: https://github.com/briankung
815 .. _browniebroke: https://github.com/browniebroke
816 .. _butorov: https://github.com/butorov
817 .. _davidstockwell: https://github.com/davidstockwell
818 .. _dekkers: https://github.com/dekkers
819 .. _devlocal: https://github.com/devlocal
820 .. _dnmellen: https://github.com/dnmellen
821 .. _edwinlunando: https://github.com/edwinlunando
822 .. _elcolie: https://github.com/elcolie
823 .. _eriktelepovsky: https://github.com/eriktelepovsky
824 .. _evenicoulddoit: https://github.com/evenicoulddoit
825 .. _f213: https://github.com/f213
826 .. _Formulka: https://github.com/Formulka
827 .. _glarrain: https://github.com/glarrain
828 .. _graik: https://github.com/graik
829 .. _gonzalobf: https://github.com/gonzalobf
830 .. _horpto: https://github.com/horpto
831 .. _inureyes: https://github.com/inureyes
832 .. _ivanchenkodmitry: https://github.com/ivanchenkodmitry
833 .. _jaavii1988: https://github.com/jaavii1988
834 .. _jack-cvr: https://github.com/jack-cvr
835 .. _jakewins: https://github.com/jakewins
836 .. _jonashaag: https://github.com/jonashaag
837 .. _jplehmann: https://github.com/jplehmann
838 .. _kcyeu: https://github.com/kcyeu
839 .. _kjagiello: https://github.com/kjagiello
840 .. _ivirabyan: https://github.com/ivirabyan
841 .. _k8n: https://github.com/k8n
842 .. _lmdsp: https://github.com/lmdsp
843 .. _lieryan: https://github.com/lieryan
844 .. _lobziik: https://github.com/lobziik
845 .. _mattions: https://github.com/mattions
846 .. _mithrilstar: https://github.com/mithrilstar
847 .. _MrFus10n: https://github.com/MrFus10n
848 .. _msgre: https://github.com/msgre
849 .. _mstarostik: https://github.com/mstarostik
850 .. _niklasb: https://github.com/niklasb
851 .. _pjdelport: https://github.com/pjdelport
852 .. _plumdog: https://github.com/plumdog
853 .. _rach: https://github.com/rach
854 .. _rapIsKal: https://github.com/rapIsKal
855 .. _richardowen: https://github.com/richardowen
856 .. _satels: https://github.com/satels
857 .. _sjdines: https://github.com/sjdines
858 .. _snbuchholz: https://github.com/snbuchholz
859 .. _spookylukey: https://github.com/spookylukey
860 .. _stinovlas: https://github.com/stinovlas
861 .. _synotna: https://github.com/synotna
862 .. _tned73: https://github.com/tned73
863 .. _toudi: https://github.com/toudi
864 .. _tsouvarev: https://github.com/tsouvarev
865 .. _yellow-sky: https://github.com/yellow-sky
866 .. _Woile: https://github.com/Woile
867 .. _w00kie: https://github.com/w00kie
868 .. _willhcr: https://github.com/willhcr
869 .. _1337: https://github.com/1337
870 .. _humrochagf: https://github.com/humrochagf
871 .. _washeck: https://github.com/washeck
872 .. _fara: https://github.com/fara
873 .. _wearebasti: https://github.com/wearebasti
874
[end of docs/changes.rst]
</code>
I need you to solve this issue by generating a single patch file that I can apply directly to this repository using git apply. Please respond with a single patch file in the following format.
<patch>
--- a/file.py
+++ b/file.py
@@ -1,27 +1,35 @@
def euclidean(a, b):
- while b:
- a, b = b, a % b
- return a
+ if b == 0:
+ return a
+ return euclidean(b, a % b)
def bresenham(x0, y0, x1, y1):
points = []
dx = abs(x1 - x0)
dy = abs(y1 - y0)
- sx = 1 if x0 < x1 else -1
- sy = 1 if y0 < y1 else -1
- err = dx - dy
+ x, y = x0, y0
+ sx = -1 if x0 > x1 else 1
+ sy = -1 if y0 > y1 else 1
- while True:
- points.append((x0, y0))
- if x0 == x1 and y0 == y1:
- break
- e2 = 2 * err
- if e2 > -dy:
+ if dx > dy:
+ err = dx / 2.0
+ while x != x1:
+ points.append((x, y))
err -= dy
- x0 += sx
- if e2 < dx:
- err += dx
- y0 += sy
+ if err < 0:
+ y += sy
+ err += dx
+ x += sx
+ else:
+ err = dy / 2.0
+ while y != y1:
+ points.append((x, y))
+ err -= dx
+ if err < 0:
+ x += sx
+ err += dy
+ y += sy
+ points.append((x, y))
return points
</patch>
| django-money/django-money | bb222ed619f72a0112aef8c0fd9a4823b1e6e388 | Money should not override `__rsub__` method
As [reported](https://github.com/py-moneyed/py-moneyed/issues/144) in pymoneyed the following test case fails:
```python
from moneyed import Money as PyMoney
from djmoney.money import Money
def test_sub_negative():
total = PyMoney(0, "EUR")
bills = (Money(8, "EUR"), Money(25, "EUR"))
for bill in bills:
total -= bill
assert total == Money(-33, "EUR")
# AssertionError: assert <Money: -17 EUR> == <Money: -33 EUR>
```
This is caused by `djmoney.money.Money` overriding `__rsub__`. | 2021-01-10 12:54:57+00:00 | <patch>
diff --git a/djmoney/money.py b/djmoney/money.py
index 70bea0b..b744c66 100644
--- a/djmoney/money.py
+++ b/djmoney/money.py
@@ -106,7 +106,6 @@ class Money(DefaultMoney):
# we overwrite the 'targets' so the wrong synonyms are called
# Example: we overwrite __add__; __radd__ calls __add__ on DefaultMoney...
__radd__ = __add__
- __rsub__ = __sub__
__rmul__ = __mul__
diff --git a/docs/changes.rst b/docs/changes.rst
index 2643b4d..8c0b272 100644
--- a/docs/changes.rst
+++ b/docs/changes.rst
@@ -15,6 +15,7 @@ Changelog
**Fixed**
- Pin ``pymoneyed<1.0`` as it changed the ``repr`` output of the ``Money`` class.
+- Subtracting ``Money`` from ``moneyed.Money``. Regression, introduced in ``1.2``. `#593`_
`1.2.2`_ - 2020-12-29
---------------------
@@ -694,6 +695,7 @@ wrapping with ``money_manager``.
.. _0.3: https://github.com/django-money/django-money/compare/0.2...0.3
.. _0.2: https://github.com/django-money/django-money/compare/0.2...a6d90348085332a393abb40b86b5dd9505489b04
+.. _#593: https://github.com/django-money/django-money/issues/593
.. _#586: https://github.com/django-money/django-money/issues/586
.. _#585: https://github.com/django-money/django-money/pull/585
.. _#583: https://github.com/django-money/django-money/issues/583
</patch> | diff --git a/tests/test_money.py b/tests/test_money.py
index 65199c4..bb7ea79 100644
--- a/tests/test_money.py
+++ b/tests/test_money.py
@@ -2,7 +2,7 @@ from django.utils.translation import override
import pytest
-from djmoney.money import Money, get_current_locale
+from djmoney.money import DefaultMoney, Money, get_current_locale
def test_repr():
@@ -114,3 +114,12 @@ def test_decimal_places_display_overwrite():
assert str(number) == "$1.23457"
number.decimal_places_display = None
assert str(number) == "$1.23"
+
+
+def test_sub_negative():
+ # See GH-593
+ total = DefaultMoney(0, "EUR")
+ bills = (Money(8, "EUR"), Money(25, "EUR"))
+ for bill in bills:
+ total -= bill
+ assert total == Money(-33, "EUR")
| 0.0 | [
"tests/test_money.py::test_sub_negative"
] | [
"tests/test_money.py::test_repr",
"tests/test_money.py::test_html_safe",
"tests/test_money.py::test_html_unsafe",
"tests/test_money.py::test_default_mul",
"tests/test_money.py::test_default_truediv",
"tests/test_money.py::test_reverse_truediv_fails",
"tests/test_money.py::test_get_current_locale[pl-PL_PL]",
"tests/test_money.py::test_get_current_locale[pl_PL-pl_PL]",
"tests/test_money.py::test_round",
"tests/test_money.py::test_configurable_decimal_number",
"tests/test_money.py::test_localize_decimal_places_default",
"tests/test_money.py::test_localize_decimal_places_overwrite",
"tests/test_money.py::test_localize_decimal_places_both",
"tests/test_money.py::test_add_decimal_places",
"tests/test_money.py::test_add_decimal_places_zero",
"tests/test_money.py::test_mul_decimal_places",
"tests/test_money.py::test_fix_decimal_places",
"tests/test_money.py::test_fix_decimal_places_none",
"tests/test_money.py::test_fix_decimal_places_multiple",
"tests/test_money.py::test_decimal_places_display_overwrite"
] | bb222ed619f72a0112aef8c0fd9a4823b1e6e388 |
|
mkdocs__mkdocs-2290 | "You will be provided with a partial code base and an issue statement explaining a problem to resolv(...TRUNCATED) | mkdocs/mkdocs | d6bfb1bc6f63d122556865aeee4ded17c536cb03 | "Config.validate keeps 'mdx_configs' values across different instances\n```python\r\nimport mkdocs.c(...TRUNCATED) | 2021-01-19 23:04:35+00:00 | "<patch>\ndiff --git a/mkdocs/config/config_options.py b/mkdocs/config/config_options.py\nindex 5e65(...TRUNCATED) | "diff --git a/mkdocs/tests/config/config_tests.py b/mkdocs/tests/config/config_tests.py\nindex fe24f(...TRUNCATED) | 0.0 | [
"mkdocs/tests/config/config_tests.py::ConfigTests::test_multiple_markdown_config_instances"
] | ["mkdocs/tests/config/config_tests.py::ConfigTests::test_config_option","mkdocs/tests/config/config_(...TRUNCATED) | d6bfb1bc6f63d122556865aeee4ded17c536cb03 |
|
python-visualization__branca-90 | "You will be provided with a partial code base and an issue statement explaining a problem to resolv(...TRUNCATED) | python-visualization/branca | a872bef69cfef4ab40380686b637c99dee6a8187 | "Expose legend_scaler's max_labels in ColorMap\nHi,\r\n\r\nI see that `legend_scaler` has an option (...TRUNCATED) | 2021-04-11 12:05:08+00:00 | "<patch>\ndiff --git a/branca/colormap.py b/branca/colormap.py\nindex 1544537..8db9583 100644\n--- a(...TRUNCATED) | "diff --git a/tests/test_colormap.py b/tests/test_colormap.py\nindex 51cfbb3..4b9483b 100644\n--- a/(...TRUNCATED) | 0.0 | ["tests/test_colormap.py::test_max_labels_linear[10-expected0]","tests/test_colormap.py::test_max_la(...TRUNCATED) | ["tests/test_colormap.py::test_simple_step","tests/test_colormap.py::test_simple_linear","tests/test(...TRUNCATED) | a872bef69cfef4ab40380686b637c99dee6a8187 |
|
cunla__fakeredis-py-223 | "You will be provided with a partial code base and an issue statement explaining a problem to resolv(...TRUNCATED) | cunla/fakeredis-py | 96c6c1f6633bee883d460931c30179d7558785be | "All aio.FakeRedis instances share the same server\n**Describe the bug**\r\n\r\nIf I create two clie(...TRUNCATED) | 2023-08-07 15:03:21+00:00 | "<patch>\ndiff --git a/docs/about/changelog.md b/docs/about/changelog.md\nindex 7c3b508..d3af774 100(...TRUNCATED) | "diff --git a/test/test_redis_asyncio.py b/test/test_redis_asyncio.py\nindex d983d2b..df69904 100644(...TRUNCATED) | 0.0 | [
"test/test_redis_asyncio.py::TestInitArgs::test_connection_different_server"
] | ["test/test_redis_asyncio.py::TestInitArgs::test_singleton","test/test_redis_asyncio.py::TestInitArg(...TRUNCATED) | 96c6c1f6633bee883d460931c30179d7558785be |
|
kdunee__pyembeddedfhir-19 | "You will be provided with a partial code base and an issue statement explaining a problem to resolv(...TRUNCATED) | kdunee/pyembeddedfhir | e397c352453b9b910e807037826ff3c03ca3c413 | Resolve outstanding TODOs related to error handling | 2021-11-09 20:26:40+00:00 | "<patch>\ndiff --git a/HISTORY.rst b/HISTORY.rst\nindex 6220350..3518afd 100755\n--- a/HISTORY.rst\n(...TRUNCATED) | "diff --git a/tests/unit/test_fhir_runner.py b/tests/unit/test_fhir_runner.py\nnew file mode 100644\(...TRUNCATED) | 0.0 | ["tests/unit/test_fhir_runner.py::TestNetworkRemoval::test_network_removal_when_failure","tests/unit(...TRUNCATED) | ["tests/unit/test_fhir_runner.py::TestNetworkRemoval::test_when_success","tests/unit/test_fhir_runne(...TRUNCATED) | e397c352453b9b910e807037826ff3c03ca3c413 |
|
pydantic__pydantic-2228 | "You will be provided with a partial code base and an issue statement explaining a problem to resolv(...TRUNCATED) | pydantic/pydantic | 78934db63169bf6bc661b2c63f61f996bea5deff | "Impl. of BaseModel with method crashes when Cythonized\n# Bug\r\n\r\nOutput of `python -c \"import (...TRUNCATED) | 2020-12-30 18:17:42+00:00 | "<patch>\ndiff --git a/pydantic/main.py b/pydantic/main.py\n--- a/pydantic/main.py\n+++ b/pydantic/m(...TRUNCATED) | "diff --git a/tests/test_edge_cases.py b/tests/test_edge_cases.py\n--- a/tests/test_edge_cases.py\n+(...TRUNCATED) | 0.0 | [
"tests/test_edge_cases.py::test_cython_function_untouched"
] | ["tests/test_edge_cases.py::test_str_bytes","tests/test_edge_cases.py::test_str_bytes_none","tests/t(...TRUNCATED) | 78934db63169bf6bc661b2c63f61f996bea5deff |
|
facebookresearch__hydra-625 | "You will be provided with a partial code base and an issue statement explaining a problem to resolv(...TRUNCATED) | facebookresearch/hydra | 9544b25a1b40710310a3732a52e373b5ee11c242 | "Fix error message of adding config value without +\n1. Fix defaults override case to match:\r\n```\(...TRUNCATED) | 2020-05-29 18:30:52+00:00 | "<patch>\ndiff --git a/hydra/_internal/config_loader_impl.py b/hydra/_internal/config_loader_impl.py(...TRUNCATED) | "diff --git a/tests/test_config_loader.py b/tests/test_config_loader.py\nindex c92d0cd17f..f6e9ef8f7(...TRUNCATED) | 0.0 | ["tests/test_config_loader.py::test_apply_overrides_to_defaults[adding_without_plus]","tests/test_co(...TRUNCATED) | ["tests/test_config_loader.py::TestConfigLoader::test_load_configuration[file]","tests/test_config_l(...TRUNCATED) | 9544b25a1b40710310a3732a52e373b5ee11c242 |
End of preview. Expand
in Data Studio
README.md exists but content is empty.
- Downloads last month
- 33