- New project [2021-05-04 Tue]
- Two principal aims:
- Analyzing velocity statistics, such as structure functions
- Collaboration with Javier García Vázquez
- Looking for Raman-scattered Balmer wings
- Collaboration with Mabel Valerdi
- Analyzing velocity statistics, such as structure functions
I have made a series of notebooks
- Notebook: 00-00-MPDAF-simple-demo.ipynb
- Pure python: 00-00-MPDAF-simple-demo.py
- Reading in the data cube, summing all waves to get an image, summing all pixels to get a spectrum
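  - For the record, those steps boil down to something like this (my minimal sketch using the documented MPDAF calls; the filename is the cube listed below):
      from mpdaf.obj import Cube

      cube = Cube("ADP.2017-10-16T11_04_19.247.fits")
      image = cube.sum(axis=0)          # collapse wavelength axis -> white-light Image
      spectrum = cube.sum(axis=(1, 2))  # collapse spatial axes -> total Spectrum
      image.plot()
      spectrum.plot()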
- Notebook: 00-01-extract-subregions.ipynb
- Pure python: 00-01-extract-subregions.py
- Extract spectral ranges and spatial regions of the data cube
- Use of masks in 2D and 3D
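  - A sketch of what these operations look like in MPDAF (wavelength limits and pixel ranges are placeholders, not the notebook's actual values):
      import numpy as np
      from mpdaf.obj import Cube

      cube = Cube("ADP.2017-10-16T11_04_19.247.fits")
      # Spectral sub-range (wavelengths in angstroms, observed frame)
      ha_cube = cube.select_lambda(6550.0, 6600.0)
      # Spatial sub-region by pixel slicing: cube[wave, y, x]
      corner = cube[:, :50, :50]
      # 2D mask on a summed image: hide negative pixels
      image = cube.sum(axis=0)
      image.mask_selection(np.where(image.data < 0.0))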
- Notebook: 01-00-line-ratios.ipynb
- Pure python: 01-00-line-ratios.py
- Mainly concentrating on the [S II] 6716,31 lines for now
- Demonstrates a novel (I believe) technique of compensating for uncertainties in the zero-point, which makes use of the known line ratio in the low-density limit
- Makes use of the `pandas` and `seaborn` libraries to plot joint distributions of extracted line intensities
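  - A rough reconstruction of the zero-point idea (this is my sketch, not the notebook's actual code): in the low-density limit the [S II] 6716/6731 ratio tends to a known constant, so the faintest pixels constrain any additive offset. The data frame column names are likewise made up:
      import numpy as np
      import pandas as pd
      import seaborn as sns

      R0 = 1.45  # approximate low-density limit of the [S II] 6716/6731 ratio

      def zero_point_offset(i6716, i6731, faint_frac=0.1):
          """Additive zero-point correction for the 6716 map, assuming (for
          simplicity) that the 6731 zero point is already correct."""
          # The faintest pixels should be closest to the low-density limit
          faint = i6731 < np.nanquantile(i6731, faint_frac)
          # Solve <(i6716 + a) / i6731> = R0 over the faint pixels
          return np.nanmean(R0 * i6731[faint] - i6716[faint])

      # Fake maps just to exercise the function: true ratio R0, offset of -20
      rng = np.random.default_rng(42)
      i6731 = rng.gamma(2.0, 100.0, size=(300, 300))
      i6716 = R0 * i6731 - 20.0
      print(zero_point_offset(i6716, i6731))  # recovers ~ +20

      # Joint distribution of the two line intensities
      df = pd.DataFrame({"sii6716": i6716.ravel(), "sii6731": i6731.ravel()})
      sns.jointplot(data=df, x="sii6731", y="sii6716", kind="hex")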
- Notebook: 02-00-raman-wings.ipynb
- Pure python: 02-00-raman-wings.py
- Yes, we find the wings in NGC 346
- We have to mask out about 60% of the spatial pixels that are contaminated with stars in order to see it properly
- Makes use of the `regions` library to select boxes in the image
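  - A sketch of the box selection with `regions` (coordinates and sizes are placeholders):
      import numpy as np
      from regions import PixCoord, RectanglePixelRegion

      data = np.arange(400 * 400, dtype=float).reshape(400, 400)  # stand-in image
      box = RectanglePixelRegion(center=PixCoord(x=160, y=200), width=40, height=30)
      mask = box.to_mask()        # RegionMask for the box
      cutout = mask.cutout(data)  # the boxed sub-array
      print(cutout.shape)         # (30, 40)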
- N66 is the H II region, while NGC 346 is technically the star cluster, but the H II region is also sometimes referred to by that name
- Data are at SMC-NGC-346/ADP.2017-10-16T11_04_19.247.fits
- Papers on NGC 346
- Valerdi:2019a Helium abundance
- Measures physical conditions too:
- [O III] Te = 13000 K
- ne = 20-30 ([O II] and [S II]) up to 100 ([Fe III])
- Looks like this will be possible for the brighter lines at least
- There is a problem that seems to be due to an overzealous background subtraction, which makes some of the lines come out negative
- We can maybe fix this by adding on the spectrum seen in the darkest corner of the image
- With the NGC 346 cube, it looks like the pipeline sky subtraction tried to use the science image itself to determine the sky
- But this is a bad idea since the H II region fills the FOV of the instrument
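- A sketch of the darkest-corner correction (I am guessing at the corner coordinates, and assuming MPDAF supports Cube plus Spectrum broadcast arithmetic, which I believe it does):
      from mpdaf.obj import Cube

      cube = Cube("ADP.2017-10-16T11_04_19.247.fits")
      # Mean spectrum of a (hopefully) star-free faint corner: cube[wave, y, x]
      dark = cube[:, :20, :20].mean(axis=(1, 2))
      # Add it back everywhere to undo the over-subtraction
      fixed = cube + dark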
- General discussion of sky subtraction in pipeline:
- Sec 3.9.3 of Weilbacher:2020a
- MUSE Pipeline Manual
- External tool to fix up sky subtraction on pipeline-reduced cube
- ZAP – the Zurich Atmosphere Purge
- Soto:2016n
- https://github.com/musevlt/zap
- Unfortunately, this seems to require the use of a blank exposure of adjacent sky in cases like ours, and I don’t think that is available
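- For the record, the basic single-cube call (from the ZAP readme) would be something like the following, but without a separate sky exposure it presumably runs into the same filled-FOV problem as the pipeline:
      import zap
      zap.process("ADP.2017-10-16T11_04_19.247.fits", outcubefits="cube_zapped.fits")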
Region | log10 Q(H) (s^-1) | R (pc) | n (cm^-3) | (Q_49 n)^(1/3) | U(IR) (Hab) | Q/4πR² (1e9 cm^-2 s^-1) | ff |
---|---|---|---|---|---|---|---|
Orion | 49.0 | 0.25 | 4000 | 15.9 | ? | 2089.44 | 2.442 |
N4 | 49.2 | 10.2 | 500 | 9.3 | 740 | 1.27 | 0.002 |
N11 | 51.0 | 145 | 50 | 17.1 | 230 | 0.40 | 0.004 |
N30 | 49.7 | 45.1 | 60 | 6.7 | 250 | 0.21 | 0.005 |
N44 | 50.6 | 103 | 60 | 13.4 | 230 | 0.31 | 0.003 |
N48 | 49.9 | 75.6 | 50 | 7.4 | 140 | 0.12 | 0.002 |
N55 | 50.0 | 52.4 | 50 | 7.9 | 200 | 0.30 | 0.009 |
N59 | 50.5 | 56.7 | 120 | 15.6 | 400 | 0.82 | 0.004 |
N79 | 50.2 | 64.0 | 80 | 10.8 | 320 | 0.32 | 0.003 |
N105 | 50.1 | 42.2 | 130 | 11.8 | 340 | 0.59 | 0.003 |
N119 | 50.5 | 85.8 | 60 | 12.4 | 200 | 0.36 | 0.004 |
N144 | 50.4 | 71.3 | 70 | 12.1 | 270 | 0.41 | 0.004 |
N157 (30 Dor) | 51.7 | 98.9 | 250 | 50.0 | 860 | 4.28 | 0.003 |
N160 | 51.0 | 40.0 | 120 | 22.9 | 380 | 5.22 | 0.034 |
N180 | 50.1 | 39.3 | 120 | 11.5 | 230 | 0.68 | 0.005 |
N191 | 49.0 | 30.5 | 50 | 3.7 | 500 | 0.09 | 0.004 |
N206 | 50.5 | 112 | 50 | 11.6 | 140 | 0.21 | 0.003 |
DEM S74 | 49.1 | 47.9 | 30 | 3.4 | 40 | 0.05 | 0.004 |
N13 | 49.0 | 8.87 | 260 | 6.4 | 280 | 1.06 | 0.007 |
N17 | 49.1 | 26.6 | 70 | 4.5 | 120 | 0.15 | 0.004 |
N19 | 48.8 | 12.4 | 160 | 4.7 | 140 | 0.34 | 0.004 |
N22 | 49.1 | 16.0 | 160 | 5.9 | 740 | 0.41 | 0.004 |
N36 | 49.9 | 44.4 | 60 | 7.8 | 80 | 0.34 | 0.008 |
N50 | 49.8 | 76.3 | 20 | 5.0 | 50 | 0.09 | 0.011 |
N51 | 48.8 | 33.7 | 30 | 2.7 | 140 | 0.05 | 0.006 |
N63 | 49.0 | 23.1 | 60 | 3.9 | 90 | 0.16 | 0.007 |
N66 | 50.6 | 63.9 | 100 | 15.8 | 380 | 0.81 | 0.005 |
N71 | 48.2 | 3.55 | 330 | 3.7 | 240 | 1.05 | 0.010 |
N76 | 50.0 | 55.0 | 70 | 8.9 | 130 | 0.28 | 0.004 |
N78 | 49.7 | 46.1 | 70 | 7.1 | 570 | 0.20 | 0.003 |
N80 | 49.4 | 39.0 | 50 | 5.0 | 90 | 0.14 | 0.005 |
N84 | 50.2 | 101 | 30 | 7.8 | 160 | 0.13 | 0.005 |
N90 | 49.5 | 30.2 | 50 | 5.4 | 110 | 0.29 | 0.014 |
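- The derived columns appear to follow from Q, R, and n; this is my reconstruction of the arithmetic (assuming case-B α_B ≈ 2.6e-13 cm³/s), which reproduces e.g. the N4 row:
      import numpy as np

      PC = 3.086e18       # cm per parsec
      ALPHA_B = 2.6e-13   # case-B recombination coefficient, cm^3/s at ~1e4 K

      def derived_columns(logQ, R_pc, n):
          Q = 10.0 ** logQ                # ionizing photons / s
          R = R_pc * PC                   # radius in cm
          qn13 = (Q / 1e49 * n) ** (1 / 3)            # (Q_49 n)^(1/3)
          phi = Q / (4 * np.pi * R**2) / 1e9          # flux in 1e9 / cm^2 / s
          ff = Q / ((4 / 3) * np.pi * R**3 * n**2 * ALPHA_B)  # filling factor
          return qn13, phi, ff

      print(derived_columns(49.2, 10.2, 500))  # N4 -> approx (9.3, 1.27, 0.002)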
- [2022-04-28 Thu] What I have at the moment with the git subtrees is a bit of a mess
- For instance, extract.py uses the tetrabloks library
- Previously it used a relative import, but that only worked when it looked like they were both in the same package
- But now that I want to make my own whispy package, it is not going to work
- Ideally, I should just have dependencies on the other packages
- tetrabloks is the only one that is used by whispy
- but cloudytab and wcsfile are used by some of the jupyter notebooks
- Plan of action:
- [X] I am going to make a new project for whispy, which will probably use the Hypermodern Python template
- It can just use tetrabloks as a dependency, using pip to install directly from github
- But it seems that the repo name is actually multibin-maps
- So why is it called tetrabloks here? Why did I not document this?
- [X] I can also install tetrabloks and the other libraries (wcsfile and cloudytab) directly in my default conda environment
- doing that now in py39 environment
pip install git+https://github.com/will-henney/whispy.git
- This seemed to work, and it pulled in the tetrabloks package too
- It also upgraded a whole bunch of my other packages, such as numpy and astropy. Hopefully that will not cause me problems.
- Also, pip called out some inconsistencies it had failed to resolve, so I did
pip install "pillow>=8.3.2" mimeparse
and everything seems to be OK
- [X] So I tried it out in the jupyter console, and I could import them fine
- [X] Test using these versions in the notebook
- Tested all except the sky line gaussian fitting, since that will be more involved. But what I have tested is enough to show that the package import mechanism and basic functionality do work.
- [X] If that all works, then I can remove the git submodules entirely
- [-] Finally, I should really make a special virtual environment for this project that automatically pulls in the needed dependencies
- [X] First step towards this is to make a requirements.txt
- I made this using pigar, which looks for all imports in a project, including jupyter notebooks
- I installed it with pipx
- It found most of the libraries, but I had to make some adjustments by hand
- [ ] Then I have to decide how to make a virtual environment and test it
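  - One untested route would be the standard venv workflow:
      python -m venv .venv
      source .venv/bin/activate
      pip install -r requirements.txt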
- [X] I am going to make a new project for whispy, which will probably use the Hypermodern Python template
- This now seems to have been a bad idea on the whole, at least for tetrabloks
- I tried pushing the tetrabloks subrepo back to the upstream multibin-maps repo on github, but that did not work. I can make a new branch there to receive it, but the history has nothing in common with the actual repo, so I can’t do a pull request to merge them
- On the other hand, everything worked out fine when I did the same with the cloudytab repo
- Conclusion is that I must have messed something up with multibin-maps, either when I moved it to the lib folder, or when I renamed it to tetrabloks
- What I plan to do to fix things is to remove the subtree and then incorporate the changes made directly into the original multibin-maps repo again
- After that, we can add it as a dependency and let pip, poetry or whatever deal with installing it from github
- This was very easy to do with two commands using magit
- I added the github repo for multibin-maps as a remote with `M a`
  - Corresponding raw git command:
    git … remote add -f multibin-maps https://github.com/will-henney/multibin-maps.git
- Then I added it as a subtree with `O i a`
  - Corresponding raw git command:
    git … subtree add --prefix=multibin-maps multibin-maps HEAD
- This made a new commit to this repo for adding the subtree
- And it also seems to have pulled in all the history from the subtree too
- Note that I use the same name `multibin-maps` for:
  - The name of the remote
  - The prefix (subfolder name)
- There is no need to do anything unless we want to update to a newer version
- In magit use the following commands:
  - `O i f` to pull a new version from upstream
  - `O e p` to push any local changes back to the upstream multibin-maps repo on github
- But probably better to just do edits in the upstream repo and then pull them
- I did the same as above, except that I set the subtree prefix to `lib/cloudytab`
  - Also, I had already moved multibin-maps to the `lib` folder earlier
- So the corresponding raw git commands (from `M a` and `O i a`) were:
    git … remote add -f cloudytab https://github.com/div-B-equals-0/cloudytab.git
    Updating cloudytab
    Unpacking objects: 100% (14/14), 3.81 KiB | 300.00 KiB/s, done.
    From https://github.com/div-B-equals-0/cloudytab
     * [new branch]      main       -> cloudytab/main
    git … subtree add --prefix=lib/cloudytab cloudytab HEAD
    git fetch cloudytab HEAD
    From https://github.com/div-B-equals-0/cloudytab
     * branch            HEAD       -> FETCH_HEAD
    Added dir 'lib/cloudytab'
- [2024-03-13 Wed] I deleted this since it is now pip-installable, which is a much better solution
- I got rid of the folder from the command line with `git rm -rf lib/cloudytab`
- Then I ditched the remote with `M k` in magit
- Note that most of this is not necessary most of the time
- Just make changes to whatever version you like: `.md` or `.py` in an editor, or `.ipynb` in the browser
- Then the other formats should get automagically updated too
- I use jupytext to automatically save `.py` and `.md` versions of the notebooks
- Any of the versions can be edited, and then the others can be synced to the last-modified one:
    jupytext --sync FILE
- Extract from the jupytext help screen:
    --sync, -s  Synchronize the content of the paired representations of the given notebook. Input cells are taken from the file that was last modified, and outputs are read from the ipynb file, if present. (default: False)
- According to the jupytext FAQ, I can just edit the `.md` or `.py` version while the notebook is still open in the browser. It will notice that something has changed and offer me the chance to reload it
  - [X] I need to try this out
    - Yes, it actually works. It is best to force-reload with `Cmd-R` as soon as you switch back to the browser window
- We can further convert to a `.org` file using pandoc. For instance:
    pandoc -o 01-extract-subregions.org 01-extract-subregions.md
- In principle, we could edit the org file and then use pandoc to send it back to md, and then jupytext again to sync up the ipynb version.
- But I haven’t tried this yet
- I don’t know whether the jupytext metadata will survive
- A quick test shows that this metadata doesn't get written to the `.org` file by pandoc
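- The untried round trip would presumably look like this, with the caveat that the jupytext YAML header would have to be restored by hand since pandoc drops it:
    pandoc -f org -t markdown -o 01-extract-subregions.md 01-extract-subregions.org
    jupytext --sync 01-extract-subregions.md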
- I have added type hints to the functions in lib/extract.py
- [ ] I should really write a bunch of tests too
- Internal consistency can be checked with mypy:
mypy --namespace-packages -m extract