Tuesday, December 14, 2010

Reconstruction, @#$% Yeah!

Oh Alexia you are a genius! Check it out everyone, this reconstruction looks pretty darn good, and it is on the SMALL data set, can't wait to see how the big one looks. Boo ya!

It's 2am, so I am not going to bother describing the details... more to follow.... cyan is data, red is reconstruction.



Making Randoms within Mask

I spent the weekend trying to make random catalogs inside the masks. They are taking forever to run, so I probably need to optimize better. I didn't spend too much time doing this before because I figured we would only need to make the random catalogs once and then we could just use them over and over again.

However, I've been running the code to generate them for > 13 hours and they still haven't finished, so there is a problem somewhere. The issue is that the mask represents a small region of the sky, but I am generating random objects over the whole sky, so 99.99% of the objects are rejected and thus the code is very inefficient.

Update: While trying to optimize this code I found a bug that explains why it was taking so long to run. Here is some testing of how it scales with the number of randoms generated:

photo randoms (all times Mon Dec 13, 2010):
500 randoms: 22:28:44 to 22:29:13 (29 seconds)
5000 randoms: 22:31:02 to 22:34:00 (~3 minutes)
50,000 randoms: 22:36:04 to 23:02:00 (~28 minutes)

spectro randoms (all times Mon Dec 13, 2010):
500 randoms: 22:31:45 to 22:32:10 (25 seconds)
5000 randoms: 22:34:38 to 22:37:06 (~2.5 minutes)
50,000 randoms: 22:38:51 to 23:01:42 (~23 minutes)

The way the code works in its current (inefficient) form:

For Spectroscopic Data:
1) Read in the two masks.
2) Create random ra and dec sets 5X as big as the spectro data set.
3) Assign each random a redshift drawn from the redshifts of the spectro data set (to keep the same redshift distribution).
4) See if the ra/dec is in both masks.
5) If it is in both masks, look at the weight in the BOSS mask. Assign a random number (between 0 and 1) to the object; if it is less than or equal to the BOSS mask weight the object is "observed", if it is greater than the BOSS weight it "isn't observed".
6) Repeat the above until you reach the right number of randoms (see the sketch below).

For Photometric Data:
1) Read in the two masks.
2) Create random ra and dec sets 5X as big as the photo data set.
3) See if the ra/dec is in both masks.
4) Repeat the above until you reach the right number of randoms.

We don't need to assign a redshift to the photo data or "observe or not observe" because the photometric data doesn't have weights.
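As an aside, here is a minimal Python sketch of the rejection loop described above. The in_both_masks test is a hypothetical stand-in for the real mangle/HEALPix machinery, and the BOSS-weight "observed" step is left out, so treat it as illustrative only:

import numpy as np

def make_randoms(n_target, in_both_masks, redshifts=None):
    # Rejection sampling: draw points uniformly on the sphere and
    # keep only those that land inside both masks.
    ra_out, dec_out, z_out = [], [], []
    while len(ra_out) < n_target:
        n = 5 * n_target                  # oversample on each pass
        ra = 360.0 * np.random.rand(n)    # uniform in ra
        dec = np.degrees(np.arcsin(2.0*np.random.rand(n) - 1.0))  # uniform on the sphere
        keep = in_both_masks(ra, dec)     # boolean array, True = inside both masks
        ra_out.extend(ra[keep])
        dec_out.extend(dec[keep])
        if redshifts is not None:         # match the spectro redshift distribution
            z_out.extend(np.random.choice(redshifts, keep.sum()))
    return ra_out[:n_target], dec_out[:n_target], z_out[:n_target]

The inefficiency I complained about is visible here: if the mask covers a tiny fraction of the sky, almost every draw fails the keep test. Restricting the ra/dec draws to the mask's bounding box instead of the full sphere should cut the rejection rate by orders of magnitude.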

The code is here:

../pcode/photorandoms.pro
../pcode/spectrorandoms.pro

The code to run this as a qsub is here:
../pcode/makeRphoto.script
../pcode/makeRspec.script

You run them with the following command:
qsub makeR[photo,spec].script

The randoms have the same distribution as the data, shown in the following histograms; white is the data, red is the randoms:



Sweet!

Monday, December 13, 2010

No Overlap Data

I realized that I need to make sure that the photometric and spectroscopic data don't overlap. I've written code to do this here:

../pcode/removeDataOverlap.pro
../pcode/getSpectroPhotoData.pro

And the overlapping data is here:
/home/jessica/boss/spectroRaDecZMaskedNoOverLap.dat
/home/jessica/boss/photoRaDecZMaskedNoOverLap.dat

This is what we should use for the reconstruction.

Friday, December 10, 2010

Using the Whole Footprint with Masks

For the past few days I've been working on using masks to run correlation functions on the whole BOSS footprint. It's required learning how to use polygon masks and how to create randoms inside them. Here is the code/testing:

To mask BOSS data (spectroscopic set)

;Read in BOSS data
bossgalaxies = '/home/jessica/boss/spAll-v5_4_14.fits'
spall = mrdfits(bossgalaxies, 1)

;Trim BOSS data to galaxies (remove QSOs, stars, etc.)
isgal = where(spall.specprimary EQ 1 $
AND (spall.boss_target1 AND 2L^0+2L^1+2L^2+2L^3+2L^7) NE 0 $
AND strmatch(spall.class,'GALAXY*') $
AND spall.zwarning EQ 0)

;Select galaxies
galaxies = spall[isgal]
ra = galaxies.ra
dec = galaxies.dec
z = galaxies.z

;Select objects in the masks
weights = maskdataboth(ra,dec,z)

; Readout ra, dec, redshift of spectroscopic objects inside mask
thisfile = '/home/jessica/boss/spectroRaDecZMasked.dat'
writecol,thisfile,ra,dec,z

The code for maskdataboth is here:
/home/jessica/repository/ccpzcalib/Jessica/pythonJess/maskdataboth.pro

'maskdataboth' filters the data through both the BOSS mask and Shirley's mask to make sure the data sets are in the same ra/dec footprint.


To Mask Shirley's Data (photometric set)

;Read in Shirley's Photometric Data
file1 = "/home/shirleyho/research/SDSS_PK/powerspec_dir/DATA/ra_dec_z_run_rerun_camcol.z_gt_0.01"
readcol, file1, pra, pdec, pz, RUN, rerun, camcol, x, x, x, x, x, x, x, x, format='(F,F,F,D,D,D,D,D,F,F,F,F,F,F)'

;Select objects in the masks
pweights = maskdataboth(pra,pdec,pz)

; Readout ra, dec, redshift of photometric objects inside mask
thisfile = '/home/jessica/boss/photoRaDecZMasked.dat'
writecol,thisfile,pra,pdec,pz

Here are plots of the data sets to show that they do indeed fall in the same region of the sky:




Both these data sets are saved in the following directory: /home/jessica/boss/
The code to run this is here: ../logs/101210log.pro

Wednesday, December 8, 2010

Trimming Reconstruction

When looking over my xi and wps matrices used in the reconstruction, Alexia noticed that some of the numbers were all over the place. She suggested changing the range of the correlation functions I use to those which have enough data to give reasonable numbers. So I spent yesterday creating the following code to "trim" the matrices to whatever ranges the user wants. Here is how it works:

The wpsMat is a concatenation of all the wps(r,theta) correlation functions at the different redshift (r) bins. However, sometimes there aren't enough objects in certain redshift bins, so the values in wps for those bins are nonsensical. Likewise, there might be angles (theta) that are too big or too small. The following code allows you to decide how many of the r and theta bins to skip from the start and end of the data set:

skipStartR = 2 # number of r bins to skip at start
skipEndR = 1 # number of r bins to skip at end
skipR = skipStartR + skipEndR # total r bins skipped
skipStartT = 0 # number of theta bins to skip at start
skipEndT = 0 # number of theta bins to skip at end
#Make matrix with skipping:
makeWPSmatrix4(workingDir,rbins, corrBins, wpsMatrixFile, skipStartR, skipEndR, skipStartT, skipEndT)

Similarly, we may want to skip the same r bins in the xi matrix, and we can also skip bins along the line of sight (l):

#Create a matrix to input the correlation function data
# Write Xi Matrix to file
skipStartL = 0 # number of line of sight bins to skip at start
skipEndL = 0 # number of line of sight bins to skip at end
makeXimatrix4(workingDir,rbins,nxicorr,xiMatrixFile, skipStartR, skipEndR, skipStartL, skipEndL)
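The trimming itself is just edge slicing. Here's a minimal numpy sketch of what the skipping amounts to, under my assumption that the matrix is stored as a 2D (r bins x theta bins) array; the real concatenated layout in makeWPSmatrix4/makeXimatrix4 may differ:

import numpy as np

def trim_bins(mat, skip_start_r, skip_end_r, skip_start_t, skip_end_t):
    # Drop poorly-measured bins from the edges of a correlation matrix
    nr, nt = mat.shape
    return mat[skip_start_r : nr - skip_end_r, skip_start_t : nt - skip_end_t]

# e.g. trim_bins(wps, 2, 1, 0, 0) drops the first two r bins and the last r bin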

This doesn't appear to be helping the reconstruction. Here is reconstruction without trimming:

Reconstruction without trimming

Reconstruction with trimming:

The code to make these plots is here: ../logs/101208log.py

Tuesday, December 7, 2010

Dying Reconstruction

Yesterday I had the problem that the reconstruction code was working fine on my laptop, but when I tried to run it on riemann I kept getting a segmentation fault. It turned out it was the same problem as this post. Oh thank you blog! Now the reconstruction runs on riemann as well as my laptop.

Monday, December 6, 2010

Reconstruction Code for Data

I spent today getting the reconstruction code in shape to do it on the stripe 82 data. It is in the following file: ../pythonjess/jessreconstructionStripe82DataFull.py

I've done the data run in this run directory:
../pcode/run2010128_21

Results to come.

Friday, December 3, 2010

BOSS Galaxies

Spent today getting the BOSS Galaxies in the correct form to use as my spectroscopic data set. Here's what I did:

1) Downloaded the latest spAll file from here to ~/boss/:
/clusterfs/riemann/raid006/bosswork/groups/boss/spectro/redux/spAll-v5_4_14.fits

2) In IDL, trim the data to be galaxies in stripe 82:
bossgalaxies = '/home/jessica/boss/spAll-v5_4_14.fits'
spall = mrdfits(bossgalaxies, 1)

stripe82gal = where(((spall.ra LE 60.0) OR (spall.ra GE 300.)) $
and ((spall.dec GE -1.25) and (spall.dec LE 1.25)) $
AND spall.specprimary EQ 1 $
AND (spall.boss_target1 AND 2L^0+2L^1+2L^2+2L^3+2L^7) NE 0 $
AND strmatch(spall.class,'GALAXY*') $
AND spall.zwarning EQ 0)

3) Write these galaxies out to a col delimited file that python can read:
thisfile = '/home/jessica/boss/BOSSstripe82RaDec.dat'
writecol, thisfile, spall[stripe82gal].ra, spall[stripe82gal].dec, spall[stripe82gal].z

4) Do the same to Shirley Ho's photometric LRG file:
shirleydata = "/home/shirleyho/research/SDSS_PK/powerspec_dir/DATA/ra_dec_z_run_rerun_camcol.z_gt_0.01"
readcol, shirleydata, RA, DEC, z, RUN, rerun, camcol, x, x, x, x, x, x, x, x, format='(F,F,F,D,D,D,D,D,F,F,F,F,F,F)'
datasize = n_elements(ra)

;Data just within stripe 82
stripe82all = where(((ra LE 60.0) OR (ra GT 305.)) $
and ((dec GE -1.25) and (dec LE 1.25)))

; Readout ra, dec, redshift of stripe 82 Shirley objects
thisfile = '/home/jessica/boss/ShirlyDataRaDec.dat'
writecol, thisfile, ra[stripe82all], dec[stripe82all], z[stripe82all]

If I want to use the whole BOSS Galaxy footprint I have a mask here from Martin White:
/home/jessica/boss/bossX.002.ply

You can read this file like so:

infile = "./bossX.002.ply"
read_mangle_polygons, infile, polygons, id

There are more functions in idlutils for playing with the mask.

Now I am trying to do the reconstruction on these two data sets, using BOSS as the spectroscopic set and Shirley's LRG galaxies as the photometric set.

Here are some plots of the two sets of data:








The log file to make these data files and plots is here:
../logs/101207log.pro
../logs/101207log.py

Thursday, December 2, 2010

At Los Alamos

I am visiting Alexia at Los Alamos National Lab for the next three weeks. Los Alamos is beautiful, but COLD.

I'm putting the final touches on the data-correlation function setup.

I noticed that in the 3D auto-correlation, I am producing two different random catalogs and using the following calculation:

(dd - dr1 - r2d + r1r2) / (r1r2)

This might be more computing than necessary if it is ok for r1 = r2... but I forget if they need to be different... perhaps they do need to be different in order for this to work. It seems like something I would have thought of before.

I talked to Alexia about this and she thinks it shouldn't make a difference either way. So I am going to set r1 = r2 and see what happens....
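For reference, here is the estimator written out as a function of pair counts; a sketch that assumes each count has already been normalized by its total number of pairs:

import numpy as np

def xi_cross(dd, dr1, r2d, r1r2):
    # Generalized Landy-Szalay-style estimator with two random catalogs;
    # inputs are normalized pair counts per separation bin
    dd, dr1, r2d, r1r2 = (np.asarray(a, dtype=float) for a in (dd, dr1, r2d, r1r2))
    return (dd - dr1 - r2d + r1r2) / r1r2

With r1 = r2 this collapses to the familiar (dd - 2dr + rr)/rr, which is why it plausibly shouldn't matter, apart from the extra noise of using one random catalog instead of two.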

Old Autocorrelation Function results (without randoms fed in)
# r (Mpc/h) cross xi (r)
1.0009750000 113.0999999998
3.0009250000 4.3312368973
5.0008750000 0.1890557940
7.0008250000 0.3768587361
9.0007750000 0.4359240581
11.0007250000 0.1349752029
13.0006750000 0.1273595140
15.0006250000 -0.1351841518
17.0005750000 0.2378351539
19.0005250000 0.2994336370
21.0004750000 -0.0479553597
23.0004250000 -0.0998846930
25.0003750000 -0.1438266200
27.0003250000 -0.3070417433
29.0002750000 -0.3108741254
31.0002250000 -0.1177777522
33.0001750000 -0.0633376228
35.0001250000 -0.2604527820
37.0000750000 -0.2517495550
39.0000250000 -0.0304629134


New Autocorrelation Function results (with randoms fed in)
I think the small differences are rounding errors
# Reached checkpoint 4
# r (Mpc/h) cross xi (r)
1.0009750000 113.0999999998
3.0009250000 4.3294918806
5.0008750000 0.1890963726
7.0008250000 0.3766263941
9.0007750000 0.4359240581
11.0007250000 0.1349752029
13.0006750000 0.1273650408
15.0006250000 -0.1355281973
17.0005750000 0.2380827199
19.0005250000 0.2995001354
21.0004750000 -0.0479749624
23.0004250000 -0.0998864926
25.0003750000 -0.1437953465
27.0003250000 -0.3070974753
29.0002750000 -0.3107744869
31.0002250000 -0.1177689518
33.0001750000 -0.0632545877
35.0001250000 -0.2605975485
37.0000750000 -0.2517029624
39.0000250000 -0.0304363094

New Autocorrelation Function with r1 = r2 (with randoms fed in)
There are pretty big differences between these two, but this could just be because my data set is small. I should check on a larger set.
# Reached checkpoint 4
# r (Mpc/h) cross xi (r)
1.0009750000 120.9999999998
3.0009250000 3.9579487179
5.0008750000 0.1234024722
7.0008250000 0.3826803206
9.0007750000 0.4731333382
11.0007250000 0.0871628790
13.0006750000 0.0808080808
15.0006250000 -0.1168749344
17.0005750000 0.2334021835
19.0005250000 0.3148748159
21.0004750000 -0.0339674194
23.0004250000 -0.0762365727
25.0003750000 -0.1365716955
27.0003250000 -0.2876355988
29.0002750000 -0.3209356312
31.0002250000 -0.1164599148
33.0001750000 -0.0517181534
35.0001250000 -0.2402264337
37.0000750000 -0.2732306595
39.0000250000 -0.0168706759

Thursday, November 11, 2010

Modifying 3D Correlation Function

I'm changing the 3D correlation function code to accept data and randoms (similar to what I just did for the 2D correlation function code).

This is proving to be a little tricky because in the current version I do some tricks to make sure the periodic boundary conditions don't add power incorrectly, and I also re-scale the data because the correlation function expects it to be in a 1x1x1 box.

After talking with Alexia, I've decided to do these conversions outside of the correlation function in python, such that the code expects the data to be in a 1x1x1 box.
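Here is a minimal sketch of the kind of pre-scaling I mean (names are mine, not the pipeline's): shift the positions to the origin and divide by the largest extent so everything lands in the unit cube.

import numpy as np

def to_unit_box(xyz):
    # Map an (N, 3) array of positions into the 1x1x1 box; return the
    # offset and scale so the transformation can be undone later
    xyz = np.asarray(xyz, dtype=float)
    offset = xyz.min(axis=0)
    scale = (xyz.max(axis=0) - offset).max()  # one scale for all axes preserves shapes
    return (xyz - offset) / scale, offset, scale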

Results to come soon.....

Wednesday, November 10, 2010

Matching Results!

It turned out I was printing out the same random catalog twice from the old code! So now everything matches!

Old:
# theta (degrees) crossw(theta)
1.1250000000 9.9488993345
1.3750000000 2.3509358289
1.6250000000 1.6241751936
1.8750000000 1.2341323430
2.1250000000 1.1220636000
2.3750000000 0.9348225839
2.6250000000 0.8468507873
2.8750000000 0.8237087312
3.1250000000 0.8176693767
3.3750000000 0.7284754936
3.6250000000 0.7714957667
3.8750000000 0.7788989999
4.1250000000 0.8089800491
4.3750000000 0.7564553032
4.6250000000 0.6803548995
4.8750000000 0.6352504229
5.1250000000 0.6443996748
5.3750000000 0.5349427778
5.6250000000 0.5059338935
5.8750000000 0.4477430500

New:
# theta (degrees) crossw(theta)
1.1250000000 9.9488993345
1.3750000000 2.3509358289
1.6250000000 1.6241751936
1.8750000000 1.2341323430
2.1250000000 1.1220636000
2.3750000000 0.9348225839
2.6250000000 0.8468507873
2.8750000000 0.8237087312
3.1250000000 0.8176693767
3.3750000000 0.7284754936
3.6250000000 0.7714957667
3.8750000000 0.7788989999
4.1250000000 0.8089800491
4.3750000000 0.7564553032
4.6250000000 0.6803548995
4.8750000000 0.6352504229
5.1250000000 0.6443996748
5.3750000000 0.5349427778
5.6250000000 0.5059338935
5.8750000000 0.4477430500

Tuesday, November 9, 2010

WPS Details Comparison

Old Code:
bin dd dr rd rr
0 43185.000000 2336.000000 2352.600000 5044.200000
1 4494.000000 1037.400000 1059.200000 1966.600000
2 3790.000000 1210.200000 1214.800000 2273.400000
3 3392.000000 1400.200000 1389.800000 2550.000000
4 3556.000000 1594.000000 1681.000000 2794.000000
5 3506.000000 1823.200000 1922.600000 3108.000000
6 3564.000000 2001.400000 2054.600000 3351.400000
7 3852.000000 2182.000000 2314.000000 3557.200000
8 4056.000000 2389.400000 2425.200000 3827.600000
9 4050.000000 2566.000000 2612.400000 4035.600000
10 4614.000000 2760.600000 2791.800000 4375.000000
11 4968.000000 2998.600000 3088.800000 4575.200000
12 5392.000000 3139.600000 3286.800000 4800.400000
13 5542.000000 3347.000000 3566.800000 5038.600000
14 5544.000000 3551.600000 3687.600000 5257.200000
15 5600.000000 3746.600000 3873.400000 5582.400000
16 5946.000000 3924.800000 4025.800000 5795.600000
17 5452.000000 4065.800000 4100.200000 6035.400000
18 5464.000000 4235.600000 4345.200000 6207.400000
19 5412.000000 4380.200000 4527.400000 6357.600000

New:
bin dd dr rd rr
0 43185.000000 2352.600000 2352.600000 5442.800000
1 4494.000000 1059.200000 1059.200000 2048.600000
2 3790.000000 1214.800000 1214.800000 2347.800000
3 3392.000000 1389.800000 1389.800000 2611.800000
4 3556.000000 1681.000000 1681.000000 2922.400000
5 3506.000000 1922.600000 1922.600000 3123.200000
6 3564.000000 2054.600000 2054.600000 3369.000000
7 3852.000000 2314.000000 2314.000000 3661.200000
8 4056.000000 2425.200000 2425.200000 3896.600000
9 4050.000000 2612.400000 2612.400000 4160.200000
10 4614.000000 2791.800000 2791.800000 4406.000000
11 4968.000000 3088.800000 3088.800000 4704.600000
12 5392.000000 3286.800000 3286.800000 4951.800000
13 5542.000000 3566.800000 3566.800000 5215.400000
14 5544.000000 3687.600000 3687.600000 5396.000000
15 5600.000000 3873.400000 3873.400000 5598.200000
16 5946.000000 4025.800000 4025.800000 5856.000000
17 5452.000000 4100.200000 4100.200000 5987.400000
18 5464.000000 4345.200000 4345.200000 6225.600000
19 5412.000000 4527.400000 4527.400000 6421.800000

Looks like DD and RD match between old and new.... and in the new the DR and RD are the same, so I must be pointing to the same random file twice in the new code, better check that!

Monday, November 8, 2010

Comparing Code

I've got the old correlation function code (where it creates the randoms inside the code) spitting out the randoms it is using to a file, and I am then comparing the correlation function to my new code using these random files. Here are the two correlation function values:

New code:
1.1250000000 8.0698537517
1.3750000000 2.1596212047
1.6250000000 1.5794360678
1.8750000000 1.2344743089
2.1250000000 1.0663837941
2.3750000000 0.8913934426
2.6250000000 0.8381715643
2.8750000000 0.7880476347
3.1250000000 0.7961299595
3.3750000000 0.7176097303
3.6250000000 0.7799364503
3.8750000000 0.7428899375
4.1250000000 0.7613797003
4.3750000000 0.6948268589
4.6250000000 0.6606375093
4.8750000000 0.6165195956
5.1250000000 0.6404371585
5.3750000000 0.5409693690
5.6250000000 0.4817527628
5.8750000000 0.4327447133

Old Code:
1.1250000000 8.6318147575
1.3750000000 2.2190582732
1.6250000000 1.6004222750
1.8750000000 1.2360784314
2.1250000000 1.1005726557
2.3750000000 0.9228442728
2.6250000000 0.8531956794
2.8750000000 0.8189587316
3.1250000000 0.8018079214
3.3750000000 0.7203885420
3.6250000000 0.7855085714
3.8750000000 0.7553331002
4.1250000000 0.7845179568
4.3750000000 0.7277418330
4.6250000000 0.6775469832
4.8750000000 0.6381484666
5.1250000000 0.6541169163
5.3750000000 0.5503197800
5.6250000000 0.4978896156
5.8750000000 0.4501698754

The numbers are close, but not exactly the same... they should be exactly the same I think. :( I'm looking deeper into the calculation to see where the difference is exactly.

Tuesday, November 2, 2010

Masking both Data and Randoms

I've masked both the data and the randoms now:


The randoms (magenta) now fall exactly on top of the data (cyan). The code to do this is as follows:

;Read in data
file1 = "/home/shirleyho/research/SDSS_PK/powerspec_dir/DATA/ra_dec_z_run_rerun_camcol.z_gt_0.01"

readcol, file1, RA, DEC, z, RUN, rerun, camcol, x, x, x, x, x, x, x, x, format='(F,F,F,D,D,D,D,D,F,F,F,F,F,F)'

datasize = n_elements(ra)

;Read in mask
file = "/home/shirleyho/research/SDSS_PK/powerspec_dir/MASK_SDSS/dir_fits/tmp_unified_score_ebv_psf_bsm_nside1024.fits"

read_fits_map, file, weight, nside=nside_in, ordering=order


index = 0L
;Convert ra/dec to phi/theta expected in HEALPix
dataphi = ra*!pi/180.
datatheta = (90.0-dec)*!pi/180.
;Will be filled in with weight corresponding to ra/dec of data
dataweight = findgen(datasize)*0.0

;Function I wrote to get the weight for the data
junk = maskdata(datatheta, dataphi, index, datasize, nside_in, weight, dataweight)
;Only keeps data with a weight greater than 0.2 (inside mask)
datacut = where(dataweight GE 0.2)

;Make the randoms
randomsize = datasize*5L
randomtheta = findgen(randomsize)*0.0
randomphi = findgen(randomsize)*0.0
index = 0L
;Function I wrote to fill in randomtheta and randomphi with points inside the mask
junk = makemaskrandoms(randomtheta, randomphi, index, randomsize, nside_in, weight)

;Convert random theta/phi back to ra/dec
randomra = randomphi*180./!pi
randomdec = 90. - randomtheta*180./!pi

;Plot the data and the randoms
plot, ra, dec, psym=3, symsize=2, xr=[0,360],yr=[0,90], XTITLE = "RA", YTITLE = "DEC", TITLE = "RA vs Dec of Data and Randoms", charsize = 2, charthick = 1, thick = 2, xthick=2, ythick=2
oplot, ra[datacut], dec[datacut], psym=3, color=fsc_color('cyan')
oplot, randomra[0:100000], randomdec[0:100000], psym = 3, color=fsc_color('magenta')

Now to compute some correlation functions!

Monday, November 1, 2010

Plot of Data / Randoms

Here is a plot of the data (cyan) and randoms (magenta) using the masks, and the method outlined in this post:


I'm a little confused as to why some of the data isn't represented in the randoms. It seems the mask is different from the data; perhaps this is because there is less completeness in the regions that aren't covered by the mask. I'll ask Shirley about this, but in the meantime I'll apply the mask to both the data and the randoms, to ensure the areas covered are the same.


11/2 Update: Heard back from Shirley; she says:

"This is deliberate. And your plot reflects it :)

There is a reason to cut out certain areas in the mask, and I never said that the data covers the same area as the mask :) ! I am testing things right now for different areas to cut out :) ! (due to extinction, stars and possible photometric calibration problems)"

So it looks like I should just apply the mask to both the data and the randoms before doing the cross correlation, as I suspected.

Sunday, October 31, 2010

Happy Halloween!

Happy Halloween Everyone!
Jessica as 'Hit Girl'

Monday, October 25, 2010

Code for Random Catalog with Mask

From Wolfram Math World:


To pick a random point on the surface of a sphere, it is incorrect to select spherical coordinates θ and φ from uniform distributions θ ∈ [0,2π) and φ ∈ [0,π], since the area element dΩ=sinφdθdφ is a function of φ, and hence points picked in this way will be "bunched" near the poles (left figure above).

To obtain points such that any small area on the sphere is expected to contain the same number of points (right figure above), choose u and v to be random variates on (0,1). Then

θ = 2πu

φ = cos⁻¹(2v - 1)

gives the spherical coordinates for a set of points which are uniformly distributed over the surface of the sphere. This works since the differential element of solid angle is given by

dΩ = sinφdθdφ = -dθd(cosφ)

So for us to pick a random ra and dec on the sphere we want to use the following in idl:

theta = 2*!pi*RANDOMU(S, 1)
phi = acos(2*RANDOMU(S, 1) - 1)

This should give us a random theta and phi where θ ∈ [0,2π) and φ ∈ [0,π].

Unfortunately, just to be difficult, HEALPix expects the following as inputs for ang2pix_nest:

theta - angle (along meridian), in [0,Pi], theta=0 : north pole
phi - angle (along parallel), in [0,2*Pi]

So their convention is the opposite of mine above, and we should use the following:
phi = 2*!pi*RANDOMU(S, 1)
theta = acos(2*RANDOMU(S, 1) - 1)

Here is the code:

file = "maskfile.fits"
; nside_in is the HEALPix resolution parameter of the mask (npix = 12*nside^2)
read_fits_map, file, weight, nside=nside_in, ordering=order

; make random phi and theta
phi = 2*!pi*RANDOMU(S, 1)
theta = acos(2*RANDOMU(S, 1) - 1)

; Find pixel ipnest
ang2pix_nest, nside_in, theta, phi, ipnest

; Find weight of pixel
thisweight = weight[ipnest]

; check to make sure pixel center (thistheta, thisphi) is close to phi and theta
pix2ang_nest, nside_in, ipnest, thistheta, thisphi

(Checked this for several theta/phi and it looked good)


So here is the code to make the random catalog of size randomsize:

randomsize = 100
randomtheta = findgen(randomsize)*0.0
randomphi = findgen(randomsize)*0
index = 0

while index lt randomsize do begin
  thisindex = index
  ; keep drawing until this random is accepted (lands in a masked pixel)
  while index eq thisindex do begin
    phi = 2*!pi*RANDOMU(S, 1)
    theta = acos(2*RANDOMU(S, 1) - 1)
    ang2pix_nest, nside_in, theta, phi, ipnest
    thisweight = weight[ipnest]
    ; weight greater than 0.2 means the pixel is inside the mask
    if thisweight gt 0.2 then begin
      randomtheta[index] = theta
      randomphi[index] = phi
      index = index+1
    endif
  endwhile
endwhile
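As an aside, the same loop can be vectorized. Here is a hedged Python/healpy sketch of the equivalent (assuming healpy's ang2pix, which uses the same theta/phi convention as above):

import numpy as np
import healpy as hp

def mask_randoms(weight, nside, n_target, wcut=0.2, nest=True):
    # Vectorized rejection sampling: keep draws whose pixel weight exceeds wcut
    theta_out, phi_out = [], []
    while len(theta_out) < n_target:
        n = 4 * n_target
        phi = 2.0 * np.pi * np.random.rand(n)
        theta = np.arccos(2.0 * np.random.rand(n) - 1.0)
        ipix = hp.ang2pix(nside, theta, phi, nest=nest)
        keep = weight[ipix] > wcut
        theta_out.extend(theta[keep])
        phi_out.extend(phi[keep])
    return np.array(theta_out[:n_target]), np.array(phi_out[:n_target])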

Friday, October 22, 2010

Randoms with a Mask

I wrote code to generate the randoms using Shirley's mask. I ran it, and now need to compare it to the actual data. The actual data isn't in a fits file... it's an ASCII file with no header, so I am not exactly sure which column is what. Shirley has left the country, so she is out of communication right now. Soon I should be able to check that the randoms fall over the data. Once I do that, I can run the correlation functions and see how the reconstruction looks on this data.

Exciting!

Alexia: Tomorrow is Cosmology in Northern California, so I will be busy with that all day. But it would be good for us to touch base on Monday again.

Wednesday, October 20, 2010

Doing Efficiencies Correctly

Adam and I have been going back and forth about the best way to compute the efficiencies and completeness for the likelihood paper.

Below is an email I sent him tonight:

> print, n_elements(where(stripe82targets.like_ratio_core gt 0.543214))/45.93
> 20.03

I get the same number as you here: 20.0305

But I am confused as to why you are calling this number the total
fibers. Wouldn't the fibers be the match between stripe82targets gt
threshold and all those in BOSS_Quasars_3PCplus_v1.1_wcomp.fits?
(I call this stripe82data)

Because what you calculate above is all potential targets that pass
the threshold, but a large fraction of them were not observed when you
look at the spherematch between stripe82targets and stripe82data:

; Total objects targeted
spherematch, stripe82targets.ra, stripe82targets.dec, stripe82data.ra, $
stripe82data.dec, 2./3600, i1, i2, d12
totaltarget = n_elements(where(stripe82targets[i1].like_ratio_core gt $
threshold))/thisArea
print, totaltarget
11.5174

Which means that of the 20 objects per square degree we expected to
target, only 12.7 were actually targeted... this makes sense when we
are using a spec completeness of 0.5... and should go up as we
go down in the table, correct?

I agree that the area = 45.93 and the threshold = 0.543214

However when I do the below, I get:
> spherematch, stripe82targets.ra, stripe82targets.dec, stripe82quasars.ra, $
> stripe82quasars.dec, 2./3600, i1, i2, d12
> print, n_elements(where(stripe82targets[i1].like_ratio_core gt $
> 0.543214))/45.93
> 10.47

10.2330

Not sure where the 0.74 difference is coming from....

When I look at the total number of objects per square degree that
passed the likelihood threshold and were targeted, this was:

; Total targeted
spherematch, stripe82targets.ra, stripe82targets.dec, stripe82data.ra, $
stripe82data.dec, 2./3600, i1, i2, d12
totaltarget = n_elements(where(stripe82targets[i1].like_ratio_core gt $
threshold))/thisArea
print, totaltarget
11.5174

Which means that we have only 1.284 targets per square degree that
were targeted by likelihood but are not z > 2.15 quasars, correct?

The number that are lower redshift quasars is:

lowqsocut = where(stripe82data.z_conf_person GE 2 and $
stripe82data.Z_PERSON GE 0.5 and stripe82data.Z_PERSON LT 2.15)
stripe82lowquasars = stripe82data[lowqsocut] ; with 0.5 < redshift < 2.15

spherematch, stripe82targets.ra, stripe82targets.dec, $
stripe82lowquasars.ra, stripe82lowquasars.dec, 2./3600, i1, i2, d12

; Low Redshift Quasars targeted per square degree
lqpsd = n_elements(where(stripe82targets[i1].like_ratio_core gt $
threshold))/thisArea
print, lqpsd
0.326581



In terms of filling out your table I get the following numbers:
Spec Comp | Area  | Total fibers | # Like targets | z > 2.15 QSOs | other QSOs | not QSOs | ALL > 2.15 QSOs | ALL other QSOs
0.5       | 45.93 | 20.03        | 11.52          | 10.23         | 0.33       | 0.96     | 23.32           | 6.79
0.6       | 37.80 | 20.45        | 12.43          | 11.03         | 0.37       | 1.0      | 24.71           | 8.25
0.7       | 26.12 | 21.11        | 12.74          | 10.78         | 0.53       | 1.42     | 24.99           | 11.73
0.8       | 14.02 | 22.40        | 13.27          | 10.42         | 0.85       | 1.99     | 23.19           | 22.26
0.9       | 2.12  | 18.35        | 10.82          | 8.47          | 0.94       | 1.41     | 20.24           | 146.86


Below is the code I used to generate the table.
Let me know if you have thoughts.
Jessica


; Read in the targeting file
;erinfile = '/home/jessica/boss/boss-qso-stripe82median.fits'
;targets = mrdfits(erinfile, 2)
adamfile = '/home/jessica/boss/chunk1_wcomp.fits'
targets = mrdfits(adamfile, 1)

;stripe82cut = where((targets.dec GE -1.25) and (targets.dec LE 1.25) and ((targets.ra GT 317) or (targets.ra LT 45)))
;stripe82targets = targets[stripe82cut]
stripe82targets = targets

newratio = stripe82targets.LIKE_RATIO_CORE

; Read in the truthtable file
truthfile = '/home/jessica/boss/BOSS_Quasars_3PCplus_v1.1_wcomp.fits'
collateinfo = mrdfits(truthfile, 1)
stripe82data = collateinfo
;High human confidence quasars (with z > 2.15)
confcut = where(stripe82data.z_conf_person GE 2 and $
stripe82data.Z_PERSON GE 2.15)
stripe82quasars = stripe82data[confcut] ; with redshift > 2.15
lowqsocut = where(stripe82data.z_conf_person GE 2 and $
stripe82data.Z_PERSON GE 0.5 and stripe82data.Z_PERSON LT 2.15)
stripe82lowquasars = stripe82data[lowqsocut] ; with 0.5 < redshift < 2.15


compthresh = 0.9
wphot = where(stripe82targets.completeness ge compthresh)
wspec = where(stripe82quasars.completeness ge compthresh)
wdata = where(stripe82data.completeness ge compthresh)

;plot, stripe82targets.ra, stripe82targets.dec, psym=3
;oplot, stripe82targets[wphot].ra, stripe82targets[wphot].dec, psym=3, color=FSC_COLOR('blue')
;oplot, stripe82quasars[wspec].ra, stripe82quasars[wspec].dec, psym=7, color=FSC_COLOR('red')

areas = mrdfits('/home/jessica/boss/area_wcomp.fits',1)

thisArea = areas(where(areas.comp_thresh eq compthresh)).area

stripe82targets = stripe82targets[wphot]
stripe82quasars = stripe82quasars[wspec]
stripe82data = stripe82data[wdata]

area = 219.9 ; we know the exact area of the chunk1 file
tpsd = round(20.0*area)
sortRatio = targets[reverse(sort(targets.LIKE_RATIO_CORE))].LIKE_RATIO_CORE
threshold = sortRatio[tpsd-1]

;Comp threshold
print, compthresh

;Area
print, thisArea


;Likelihood Targets per square degree
thistpsd = n_elements(where(stripe82targets.like_ratio_core gt $
threshold))/thisArea
print, thistpsd

; Total objects targeted
spherematch, stripe82targets.ra, stripe82targets.dec, stripe82data.ra, $
stripe82data.dec, 2./3600, i1, i2, d12
totaltarget = n_elements(where(stripe82targets[i1].like_ratio_core gt $
threshold))/thisArea
print, totaltarget



spherematch, stripe82targets.ra, stripe82targets.dec, $
stripe82quasars.ra, stripe82quasars.dec, 2./3600, i1, i2, d12

; Likelihood Quasars targeted per square degree
qpsd = n_elements(where(stripe82targets[i1].like_ratio_core gt $
threshold))/thisArea
print, qpsd


spherematch, stripe82targets.ra, stripe82targets.dec, $
stripe82lowquasars.ra, stripe82lowquasars.dec, 2./3600, i1, i2, d12

; Likelihood Low Redshift Quasars targeted per square degree
lqpsd = n_elements(where(stripe82targets[i1].like_ratio_core gt $
threshold))/thisArea
print, lqpsd


;Likelihood targeted not qsos
print, totaltarget - qpsd - lqpsd


;Total z > 2.15 qsos
totalqsos = n_elements(stripe82quasars)/thisArea
print, totalqsos

;Total 0.5 < z < 2.15 qsos
totallowqsos = n_elements(stripe82lowquasars)/thisArea
print, totallowqsos

Tuesday, October 19, 2010

Cosmology in Northern California

On the small chance that anyone (other than Alexia) reads this blog, I feel like I should promote the conference I am organizing, considering it takes away so much of my research time these days. Check it out:

Monday, October 18, 2010

Next steps...

I talked to Shirley and Alexia and they both agree that the procedure outlined in my last post makes sense. Shirley told me that the weights in her file don't mean anything, so I should just treat them as a step function. I need to talk to David to make sure he agrees with all this.

Thinking I should postpone my trip to Los Alamos until December. Feeling like I'd like to have the data in better shape before I get there, and the flights are cheaper.

Steps:
  1. Talk to David and make sure he agrees with these steps. Also talk about funding for trip.
  2. Generate randoms as laid out in last post.
  3. Book tickets for December.
  4. Keep looking for temp housing in Los Alamos.

Another idea...

Another idea I have about generating this random catalog: first generate a random ra and dec, then use the function ang2pix_nest (converts angle to pixel) to figure out which pixel it falls in. Once we have that, depending on the weight, we can keep the random value or throw it away. However, this might take a really long time if most of our objects are outside of the mask.

I am also not sure how to use the weight value. Should it be a step function (if weight > 0.2 then keep, otherwise throw away), or something more subtle, like treating the weight as the probability that a random falling in that pixel is kept? In terms of the redshifts, should I do what I did before and match the random redshift distribution to the data?
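If the weight were an acceptance probability rather than a step function, the check would be a one-liner. A sketch in Python/healpy (whether this is statistically the right thing to do is exactly the open question):

import numpy as np
import healpy as hp

def accept_by_weight(ra, dec, weight, nside, nest=True):
    # Keep each random point with probability equal to its pixel weight
    theta = np.radians(90.0 - np.asarray(dec))   # HEALPix colatitude
    phi = np.radians(np.asarray(ra))
    w = weight[hp.ang2pix(nside, theta, phi, nest=nest)]
    return np.random.rand(w.size) < w            # True where the point is kept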



Learning About HEALPix

I've been playing with Shirley's LRG catalog and mask. The mask uses something called HEALPix which I've never used before. It basically breaks up the spherical sky into equal area pixels:



The map is in the form of weights for each pixel. Shirley says that if the weight is greater than 0.2 she considers that pixel in the mask.

So to create a random catalog, I just go through the weights, and if it is greater than 0.2 I place objects in that pixel.

The key is to now convert from a Healpix mask to ra/dec.
There are two routines that do this that I have found:

DPF_PIX2ANG_NEST
pix2ang_nest

The problem is that these routines return the ra/dec of the center of the pixel, and to avoid putting extra power at the pixel separations we want to select random points within the pixel. However, as you can see below, the ra range of these pixels changes dramatically as a function of cos(theta) (i.e. large at the poles and small at the equator), so it isn't a simple procedure to fill in the area of a pixel evenly with random points:


I've searched to see if there is an existing package that does this, and I can't seem to find anything. In the HEALPix documentation there is a comment on random number generation, but it doesn't seem to be what I am interested in here. I also have an email out to Shirley about this.

There are many functions that deal with HEALPix on their web page as well as in IDLUTILS. I'm currently combing through them to see if there is a function that gives me the ra/dec ranges for a given pixel, or even better, automatically picks a random point given a pixel.
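Until I find one, here is one possible approach, sketched in Python with healpy (hp.boundaries and hp.vec2ang are my assumptions about the available tooling): sample the pixel's bounding box in (cos(theta), phi), which is uniform on the sphere, and reject anything ang2pix says is outside the pixel.

import numpy as np
import healpy as hp

def random_in_pixel(nside, ipix, n, nest=True):
    # Uniform points inside one HEALPix pixel via rejection sampling from
    # its bounding box in z = cos(theta) and phi. Caveats: the naive phi
    # min/max breaks for pixels wrapping phi = 0, and polar-cap pixels can
    # bulge past the corner box (use step > 1 in boundaries for a safer box).
    corners = hp.boundaries(nside, ipix, step=1, nest=nest)  # (3, 4) corner vectors
    theta_c, phi_c = hp.vec2ang(corners.T)
    z_c = np.cos(theta_c)
    theta_out, phi_out = [], []
    while len(theta_out) < n:
        z = np.random.uniform(z_c.min(), z_c.max(), 10 * n)
        phi = np.random.uniform(phi_c.min(), phi_c.max(), 10 * n)
        theta = np.arccos(z)
        keep = hp.ang2pix(nside, theta, phi, nest=nest) == ipix
        theta_out.extend(theta[keep])
        phi_out.extend(phi[keep])
    return np.array(theta_out[:n]), np.array(phi_out[:n])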

Some questions:
  1. How do I assign a random place in a HEALPix pixel and convert this to a ra/dec position?
  2. The mask contains a "weight" for each HEALPix pixel; how do I factor the weight into the random catalog? My intuition is that pixels with higher weight should have more objects in them, but will this add clustering signal to the random catalog that we don't want?
  3. How do we distinguish what is signal because of observation effects (we only looked at that patch of the sky once, or we didn't target as many objects there) and what is signal due to clustering of the data (there is a huge galaxy cluster in that part of the sky, and that is why there is more data there)?

Tuesday, September 28, 2010

LRG Catalogs

Both Eyal and Shirley have responded pointing me towards their LRG catalogs. Shirley's comes with a mask, Eyal's comes with a random catalog, so this might be a better place to start.

There are some slight issues with Alexia using these catalogs because she is not officially in SDSS or BOSS... ahh, politics.

Digging into the data today.

Monday, September 27, 2010

Reconstruction on LRGs

I talked to Alexia today and we decided that I should get the BOSS data together and run correlation functions on it, rather than continuing to de-bug the reconstruction binning. I'm a little frustrated with that problem, and it would be nice to work on something new. I know that both Shirley Ho and Eyal Kazin have LRG catalogs, so I am contacting both of them to see if they are available for me to use.

Also planning to visit Alexia in late October/ early November. Trying to find cheap housing in Los Alamos, if anyone knows of anything out there...

Friday, September 24, 2010

Bad Blogger

To the 3 people that actually read my blog (Alexia, mom and dad), I know I have been a really bad blogger lately. I've been writing a paper, and didn't have much to post about that... and then I was in Paris for a SDSS Collaboration meeting.

Anyway, the Fall 2010 semester has started (how did that happen?!?) and I plan to get back to regular posting. I also am hoping to work exclusively on the reconstruction project now that the likelihood paper is (almost) done.

Apologies again. Here are some plots from the paper for your entertainment (the colors look a little less weird in black on white, but I inverted the images to keep with the blog color scheme):



Friday, September 17, 2010

Machine Gun Slide

Here is my machine gun talk slide from the Paris Collaboration meeting:

Thursday, September 9, 2010

Traveling - Paris

I'll be traveling for the next 10 days for a SDSS Collaboration Meeting, and personal vacation. Be back soon!

Greetings from Adam and Jess in Paris

Friday, August 27, 2010

Recomputing Efficiency and Completeness

I recomputed the Efficiency and Completeness using Stripe 82 BOSS Data:

To determine performance of the likelihood method, we calculated probabilities on single-epoch data in Stripe-82 and compared that target list with the commissioning truth table (which includes targets from all targeting methods and all previously known quasars). Likelihood probabilities were calculated on 112,809 objects and of those the top 1,450 likelihoods were selected for a target density of 40 objects per square-degree. We found an efficiency (# QSOs Found /# Total Targets) of 63% and the completeness (# QSOs Found by likelihood with 40 targets per square-degree / #QSOs found in region targeted by three methods with 80 targets per square-degree) is 55%.

The code to perform this calculation is here:
../logs/100827_2log.pro

It is a little strange because the truth table does not include all of stripe 82, so to understand the efficiencies we need to figure out the fraction of Stripe 82 which is covered by the truth table:

fraction = n_elements(targetNRA)*1.0/n_elements(where(stripe82targets.l_ratio GE threshold))*1.0 ;Fraction of stripe 82 observed

fraction = 14.5%

So the total number of targets is not the full number of objects in Erin's median seeing file, but ~14.5% of those objects.

Here are some plots:
Likelihood Selected QSOs:
Likelihood Missed QSOs:
Likelihood Selected/Missed QSOs together:
Likelihood Selected Stars:

All Three:

Code for these plots is also in the following log:
../logs/100827_2log.pro

More LF Plots


The code to make these plots is here:
../logs/100827log.pro

Thursday, August 26, 2010

Luminosity Function Plots

Here's the code to make it:
file1 = '/home/jessica/lumnew/dNdz_hrh_z_0.5-5.0.dat'
file2 = '/home/jessica/lumnew/dNdz_jiang_z_0.5-5.0.dat'
file3 = '/home/jessica/lumnew/dNdz_richards_z_0.5-5.0.dat'
file4 = "/home/jessica/repository/ccpzcalib/Jessica/likelihood/qsocatalog/dNdz_qso_z0.5-5.0JiangCombo.dat"


rdfloat,file1,hrh_z,hrh_dndz,skip=1
hrh_pz=(hrh_dndz-shift(hrh_dndz,1))/(hrh_z-shift(hrh_z,1))
plot, hrh_z[1:119], hrh_pz[1:119]

rdfloat,file2,j_z,j_dndz,skip=1
j_pz=(j_dndz-shift(j_dndz,1))/(j_z-shift(j_z,1))
oplot, j_z[1:119], j_pz[1:119], color = 1

rdfloat,file3,r_z,r_dndz,skip=1
r_pz=(r_dndz-shift(r_dndz,1))/(r_z-shift(r_z,1))
oplot, r_z[1:119], r_pz[1:119], color = 2

;Turn on colors
device,true_color=24,decomposed=0,retain=2

;Set up fonts
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT

; give each color its own index
white = GetColor('White', 1)
black = GetColor('Black', 2)
green = GetColor('Green', 3)
red = GetColor('Red', 4)
blue = GetColor('Blue', 5)
cyan = GetColor('Cyan', 6)
magenta = GetColor('Magenta', 7)



mtitle = 'Luminosity Functions'
xtitle = 'redshift (z)'
ytitle = 'QSO Distribution'

Window, XSize=800, YSize=700
plot, hrh_z[1:119], hrh_pz[1:119], xrange = [0.5,5], yrange = [0,1], Background=1, Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=2, xthick=2, ythick=2

oplot, hrh_z[1:119], hrh_pz[1:119], Color=blue, thick=2
oplot, j_z[1:119], j_pz[1:119], color = red, thick=2
oplot, r_z[1:119], r_pz[1:119], color = green, thick=2



set_plot, 'ps'
!P.FONT = 0
!P.THICK = 1
aspect_ratio = (7.0/8.0)
xsz = 5.0
ysz = xsz*aspect_ratio
device, filename = 'luminosityFunctions.eps', /color, bits_per_pixel = 8,encapsul=0, xsize=xsz,ysize=ysz, /INCHES, /TIMES, /BOLD

plot, hrh_z[1:119], hrh_pz[1:119], xrange = [0.5,5], yrange = [0,1], Background=1, Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 1, charthick = 1, thick=2, xthick=2, ythick=2
oplot, hrh_z[1:119], hrh_pz[1:119], Color=blue, thick=2
oplot, j_z[1:119], j_pz[1:119], color = red, thick=2
oplot, r_z[1:119], r_pz[1:119], color = green, thick=2

device, /close
set_plot,'X'
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT

All code is in following log file:
../logs/100826log.pro



code to make above plot is here:
../logs/100826_2log.pro

Wednesday, August 25, 2010

Comparing Luminosity Functions

Here are instructions for running different luminosity functions for comparison:

;Changing the luminosity function with Joe's new code
;code location: /home/jessica/lumnew

1) Change the location where the luminosity function is saved.
a) Open hiz_qso_dndz.pro
b) Change textout = './locationToSave.dat'
c) Change the LUMFUNCTION so that it is using the one you want it to:
LUMFUNCTION = 'HRH07' ; Hopkins, Richards, Hernquist 2007
;LUMFUNCTION = 'J06' ; Jiang 2006
;LUMFUNCTION = 'R06' ; Richards 2006
d) Change Z_MIN and Z_MAX to be the proper redshift range.

2) Run hiz_qso_dndz to make the quasar luminosity function:
idl runlumfunc.pro lum.out 2> lum.err &

3) Change hiz_kde_numerator.pro to point to the luminosity function:
dNdz_file = './dNdz_hrh_z_0.5-5.0.dat'
LUMFUNC = 'HRH07'

dNdz_file = './dNdz_jiang_z_0.5-5.0.dat'
LUMFUNC = 'J06'

dNdz_file = './dNdz_richards_z_0.5-5.0.dat'
LUMFUNC = 'R06'

4) Run hiz_kde_numerator.pro to make the QSO catalogs:
idl hiz_kde_numerator.pro hkn.out 2> hkn.err &

Generates the following QSO Catalogs:
QSO_Catalog_HRH_07.fits
QSO_Catalog_Jiang_06.fits
QSO_Catalog_Richards_06.fits

5) Change likelihood_compute.pro so it points to new catalogs:

file2 = "/home/jessica/lumnew/QSO_Catalog_HRH_07.fits"
file2 = "/home/jessica/lumnew/QSO_Catalog_Richards_06.fits"
file2 = "/home/jessica/lumnew/QSO_Catalog_Jiang_06.fits"

6) Run likelihood.script

./likelihood.script hrh07
./likelihood.script j06run
./likelihood.script r06run

Also in the following log file:
.../logs/100825log.pro

Tuesday, August 17, 2010

Overlapping Contours


;Turn on colors
device,true_color=24,decomposed=0,retain=2

;Set up fonts
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT

white = GetColor('White', 1)
black = GetColor('Black', 2)
green = GetColor('Green', 3)
red = GetColor('Red', 4)
blue = GetColor('Blue',5)

LoadCT, 33 ;load blue-red color table


;Set up contour data
bin_width = 0.05
minx=-1.0
miny=-0.5
maxx=4
maxy=2

star_contours = HIST_2D(ugscolor, grscolor, bin1=bin_width, bin2=bin_width, max1=maxx, MAX2=maxy, MIN1=minx, MIN2=miny)

;; This sets up (max1-min1)/bin1 bins in X....
x_cont = findgen(n_elements(star_contours[*,1]))
sx_cont = (x_cont*bin_width) + minx

;; And the corresponding bins in y
y_cont = findgen(n_elements(star_contours[1,*]))
sy_cont = (y_cont*bin_width) + miny


qso_contours = HIST_2D(ugqcolor, grqcolor, bin1=bin_width, bin2=bin_width, max1=maxx, MAX2=maxy, MIN1=minx, MIN2=miny)

;; This sets up (max1-min1)/bin1 bins in X....
x_cont = findgen(n_elements(qso_contours[*,1]))
qx_cont = (x_cont*bin_width) + minx

;; And the corresponding bins in y
y_cont = findgen(n_elements(qso_contours[1,*]))
qy_cont = (y_cont*bin_width) + miny

levels = 10
userlevels = findgen(levels)^2.5
maxlevel = max(userlevels)
smax = max(star_contours)*1.1
slevels = userlevels*smax/maxlevel

levels = 7
userlevels = findgen(levels)^2.5
maxlevel = max(userlevels)
qmax = max(qso_contours)*1.1
qlevels = userlevels*qmax/maxlevel

mtitle = 'Color-Color Contour Plot'
xtitle = 'u-g magnitude'
ytitle = 'g-r magnitude'

Window, XSize=800, YSize=700
Contour, star_contours, sx_cont, sy_cont, C_Colors=red, Background=1, Levels=slevels, Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=2, xthick=2, ythick=2
Contour, qso_contours, qx_cont, qy_cont, /Overplot, Levels=qLevels, Color=blue, thick=2


set_plot, 'ps'
!P.FONT = 0
!P.THICK = 1
aspect_ratio = (7.0/8.0)
xsz = 5.0
ysz = xsz*aspect_ratio
device, filename = 'contourqsostar.eps', /color, bits_per_pixel = 8,encapsul=0, xsize=xsz,ysize=ysz, /INCHES, /TIMES, /BOLD

Contour, star_contours, sx_cont, sy_cont, C_Colors=red, Background=1, Levels=slevels, Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 1, charthick = 1, thick=2, xthick=2, ythick=2
Contour, qso_contours, qx_cont, qy_cont, /Overplot, Levels=qLevels, Color=blue, thick=2

device, /close
set_plot,'X'
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT

log file:

../logs/100817log.pro

Monday, August 16, 2010

Final Plotting

From this page we see how to do color contour plots: http://www.dfanning.com/tips/contour_hole.html

Here are the plots for the QSO and EE catalogs:


Here is the code to make them... I've used non-linear color levels to show the peaks better...
For the everything-else catalog:

file1 = filepath('Likeli_everything.fits', root_dir=getenv('BOSSTARGET_DIR'), subdir='data')

;file1 = './Likeli_everything.fits.gz'
.com likelihood_compute
startemplate = likelihood_everything_file(file1, area=area1)


;Get magnitudes
;Star Magnitudes (don't need to deredden)
smag = 22.5 - 2.5*alog10(startemplate.psfflux>1.0d-3)
ugscolor = smag[0,*] - smag[1,*]
grscolor = smag[1,*] - smag[2,*]


;Turn on colors
device,true_color=24,decomposed=0,retain=2

;Set up fonts
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT

bin_width = 0.025
minx=-1.0
miny=-0.5
maxx=4
maxy=2

;Create contours
star_contours = HIST_2D(ugscolor, grscolor, bin1=bin_width, bin2=bin_width, max1=maxx, MAX2=maxy, MIN1=minx, MIN2=miny)

;; This sets up (max1-min1)/bin1 bins in X....
x_cont = findgen(n_elements(star_contours[*,1]))
x_cont = (x_cont*bin_width) + minx

;; And bins in y
y_cont = findgen(n_elements(star_contours[1,*]))
y_cont = (y_cont*bin_width) + miny

;Set up plot titles
xtitle = 'u-g magnitude'
ytitle = 'g-r magnitude'
mtitle = 'Color-Color Contour Plot EE Catalog'

levels = 256 ;plot in 256 colors
data = star_contours
x = x_cont
y = y_cont
white = GetColor('White', 1) ; know what is white
black = GetColor('Black', 2) ; know what is black
;load blue-red color table
LoadCT, 33, NColors=levels, Bottom=3

;define the levels of the color contours
;non linear to show data shape better
userlevels = findgen(levels)^3
maxlevel = max(userlevels)
thismax = max(star_contours)*1.1
userlevels = userlevels*thismax/maxlevel

;define the levels of the contour lines
;non linear to show data shape better
;only ten lines so that the text is legible
slevels = 10
suserlevels = findgen(slevels)^3
maxlevel = max(suserlevels)
thismax = max(star_contours)*1.1
suserlevels = suserlevels*thismax/maxlevel

; make window
Window, XSize=800, YSize=700
;Make contour plot
Contour, data, x, y, /Fill, C_Colors=Indgen(levels)+3, Background=1, Levels=userlevels, Position=[0.12, 0.1, 0.9, 0.80], Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=1, xthick=2, ythick=2
;Overplot black lines
Contour, data, x, y, /Overplot, Levels=suserLevels, /Follow, Color=black
;Make color bar
ColorBar, NColors=levels, Bottom=3, Divisions=slevels-1, TICKNAMES=suserlevels, Range=[Min(data), Max(data)], Format='(G8.5)', Position = [0.12, 0.9, 0.9, 0.95], Color=black


Similarly for the qso catalog:


bin_width = 0.03
minx=0.0
miny=-0.5
maxx=3.0
maxy=1.5

qso_contours = HIST_2D(ugqcolor, grqcolor, bin1=bin_width, bin2=bin_width, max1=maxx, MAX2=maxy, MIN1=minx, MIN2=miny)

;; This sets up (max1-min1)/bin1 bins in X....
x_cont = findgen(n_elements(qso_contours[*,1]))
x_cont = (x_cont*bin_width) + minx

;; And the corresponding bins in y
y_cont = findgen(n_elements(qso_contours[1,*]))
y_cont = (y_cont*bin_width) + miny


xtitle = 'u-g magnitude'
ytitle = 'g-r magnitude'
mtitle = 'Color-Color Contour Diagram of QSO Catalog'

levels = 256
data = qso_contours
x = x_cont
y = y_cont
white = GetColor('White', 1)
black = GetColor('Black', 2)
LoadCT, 33, NColors=levels, Bottom=3

userlevels = findgen(levels)^3
maxlevel = max(userlevels)
thismax = max(qso_contours)*1.1
userlevels = userlevels*thismax/maxlevel

slevels = 7
suserlevels = findgen(slevels)^3
maxlevel = max(suserlevels)
thismax = max(qso_contours)*1.1
suserlevels = suserlevels*thismax/maxlevel


Window, XSize=800, YSize=700

Contour, data, x, y, /Fill, C_Colors=Indgen(levels)+3, Background=1, Levels=userlevels, Position=[0.12, 0.1, 0.9, 0.80], Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=1, xthick=2, ythick=2
Contour, data, x, y, /Overplot, Levels=suserLevels, /Follow, Color=black
ColorBar, NColors=levels, Bottom=3, Divisions=slevels-1, TICKNAMES=suserlevels, Range=[Min(data), Max(data)], Format='(G10.5)', Position = [0.12, 0.9, 0.9, 0.95], Color=black



And to make the plots to files:

set_plot, 'ps'
!P.FONT = 0
!P.THICK = 1
aspect_ratio = (7.0/8.0)
xsz = 7.0
ysz = xsz*aspect_ratio
device, filename = 'contourstars.eps', /color, bits_per_pixel = 8,encapsul=0, xsize=xsz,ysize=ysz, /INCHES, /TIMES, /BOLD

Contour, data, x, y, /Fill, C_Colors=Indgen(levels)+3, Background=1, Levels=userlevels, Position=[0.11, 0.1, 0.9, 0.80], Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 1, charthick = 1, thick=1, xthick=2, ythick=2
Contour, data, x, y, /Overplot, Levels=suserLevels, /Follow, Color=black
ColorBar, NColors=levels, Bottom=3, Divisions=slevels-1, TICKNAMES=suserlevels, Range=[Min(data), Max(data)], Format='(G8.5)', Position = [0.11, 0.9, 0.9, 0.95], Color=black



device, /close
set_plot,'X'
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT







bin_width = 0.03
minx=0.0
miny=-0.5
maxx=3.0
maxy=1.5

qso_contours = HIST_2D(ugqcolor, grqcolor, bin1=bin_width, bin2=bin_width, max1=maxx, MAX2=maxy, MIN1=minx, MIN2=miny)

;; This sets up (max1-min1)/bin1 bins in X....
x_cont = findgen(n_elements(qso_contours[*,1]))
x_cont = (x_cont*bin_width) + minx

;; And the corresponding bins in y
y_cont = findgen(n_elements(qso_contours[1,*]))
y_cont = (y_cont*bin_width) + miny


xtitle = 'u-g magnitude'
ytitle = 'g-r magnitude'
mtitle = 'Color-Color Contour Diagram of QSO Catalog'

levels = 256
data = qso_contours
x = x_cont
y = y_cont
white = GetColor('White', 1)
black = GetColor('Black', 2)
LoadCT, 33, NColors=levels, Bottom=3

userlevels = findgen(levels)^3
maxlevel = max(userlevels)
thismax = max(qso_contours)*1.1
userlevels = userlevels*thismax/maxlevel

slevels = 7
suserlevels = findgen(slevels)^3
maxlevel = max(suserlevels)
thismax = max(qso_contours)*1.1
suserlevels = suserlevels*thismax/maxlevel

set_plot, 'ps'
!P.FONT = 0
!P.THICK = 1
aspect_ratio = (7.0/8.0)
xsz = 7.0
ysz = xsz*aspect_ratio
device, filename = 'contourqsos.eps', /color, bits_per_pixel = 8,encapsul=0, xsize=xsz,ysize=ysz, /INCHES, /TIMES, /BOLD

Contour, data, x, y, /Fill, C_Colors=Indgen(levels)+3, Background=1, Levels=userlevels, Position=[0.12, 0.1, 0.9, 0.80], Color=black, XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 1, charthick = 1, thick=1, xthick=2, ythick=2
Contour, data, x, y, /Overplot, Levels=suserLevels, /Follow, Color=black
ColorBar, NColors=levels, Bottom=3, Divisions=slevels-1, TICKNAMES=suserlevels, Range=[Min(data), Max(data)], Format='(G10.5)', Position = [0.12, 0.9, 0.9, 0.95], Color=black

device, /close
set_plot,'X'
!P.FONT = -1
!P.THICK = 1
DEVICE, SET_FONT ='Times Bold',/TT_FONT

See they look pretty:



All in the following log file:
../logs/100816log.pro

Friday, August 13, 2010

More Plotting

More plotting....

Plot to a file:
;Save color-color diagram to file
set_plot, 'ps'
!P.FONT = 0
!P.THICK = 2
aspect_ratio = (6.0/7.0)
xsz = 4.0
ysz = xsz*aspect_ratio
device, filename = 'ccstars.eps', /color, bits_per_pixel = 8,encapsul=0, xsize=xsz,ysize=ysz, /INCHES, /TIMES, /BOLD
ccplot, xobject, yobject, xtitle, ytitle, mtitle
device, /close
set_plot,'X'

or (from here -- note I needed to change the /encapsul keyword in the device command to encapsul=0)

to_eps, /eps, device=device
ccplot, xobject, yobject, xtitle, ytitle, mtitle
to_eps, /eps, device=device, /out

It looks pretty:
Contour plot with non-linear contour levels:
!P.FONT = -1
mtitle = 'Color-Color Contour Plot'
levels = findgen(25)^2*16
contour,star_contours,x_cont,y_cont,levels=levels,XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=1, xthick=1,/follow



Contour plot shaded:
levels = findgen(25)^2*16
contour,star_contours,x_cont,y_cont,levels=levels,XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=1, xthick=2, ythick=2,/cell_fill


To overlay levels:
window,xsize=700,ysize=600
loadct, 0
nlevels = 15
levels = findgen(nlevels)^4
maxlevel = max(levels)
thismax = max(star_contours)
levels = levels*thismax/maxlevel
contour,star_contours,x_cont,y_cont,levels=levels,XTITLE = xtitle, YTITLE = ytitle, TITLE = mtitle, charsize = 2, charthick = 1, thick=1, xthick=2, ythick=2,/cell_fill
contour,star_contours,x_cont,y_cont,levels=levels,/follow,/overplot