================================================================================
Title: Searching for GEMS: Characterizing Six Giant Planets around Cool Dwarfs
Authors: Kanodia S., Gupta A.F., Canas C.I., Bernabo L.M., Reji V., Han T.,
  Brady M., Seifahrt A., Cochran W.D., Morrell N., Basant R., Bean J.,
  Bender C.F., de Beurs Z.L., Bieryla A., Birkholz A., Brown N., Chapman F.,
  Ciardi D.R., Clark C.A., Cotter E.G., Diddams S.A., Halverson S., Hawley S.,
  Hebb L., Holcomb R., Howell S.B., Kobulnicky H.A., Kowalski A.F., Larsen A.,
  Libby-Roberts J., Lin A.S.J., Lund M.B., Luque R., Monson A., Ninan J.P.,
  Parker B.A., Patel N., Rodruck M., Ross G., Roy A., Schwab C., Stefánsson G.,
  Thoms A., Vanderburg A.
================================================================================

Description of contents:
This tar.gz archive contains individual numpy (npy) binary files, each holding a
pickled python (v3.9) dictionary for one TOI, created with numpy (v1.20). These
dictionaries contain the photometry time series (including flux errors) for each
instrument used in this analysis, as well as the model used for detrending where
applicable. A python script is included that demonstrates how to extract and plot
the data values from these dictionaries.

The specific files included in this tar.gz archive are:

File                          md5 checksum
PhotDictTOI5176.npy (K2-419)  d720e989d1212d9ce9b5ca53b096e5cf
PhotDictTOI5218.npy           984f360db719c8b27b014ad148d58e79
PhotDictTOI5414.npy           9ab63e2b9b77c749dee890431fdbef9c
PhotDictTOI5616.npy           9135e7ac6549df56d4f8eaa276c2a601
PhotDictTOI5634.npy           b4695c58991e5383f05a5a3afffb9502
PhotDictTOI6034.npy           436dcb52eb167fe0464128a8827fb241
TestScript.py

System requirements:
These are pickled python (v3.9) dictionaries. A dictionary can be loaded as follows:

# Tested on numpy version 1.22.3
import numpy as np
PhotDict = np.load("PhotDictTOI5176.npy", allow_pickle=True).item()

Additional comments:
A listing of the dictionary keys is provided below. Each "key" is a prefix that
applies to a set of instruments. For example, for TOI 5616 there are four
instrument data sets included: ['TESS_S22', 'TESS_S48', 'TESS_S49',
'KeplerCam_20220531'].

GP_Prot_                  Boolean         If True, simultaneous GP detrending was performed with the Prot kernel
GP_SHO_                   Boolean         If True, simultaneous GP detrending was performed with the SHO kernel
RunMaskedTransit_GP_      Boolean         If True, a GP was fit to the transit-masked light curve and subsequently subtracted
SubtractTransitMaskedGP_  Boolean         If True, the GP fit to the transit-masked light curve was subtracted from y_phot_ to detrend it
dilution_                 Boolean         If True, dilution was allowed to float and the transit depth was constrained by the other instruments with dilution = False
smooth_                   array(float)    Savitzky-Golay smoothed flux array
t_phot_                   array(float)    Super-sampled time axis, more relevant for long-cadence datasets
texp_                     float           Exposure time (days)
transit_mask_             array(boolean)  Boolean mask +/- 0.25 days around each transit
x_phot_                   array(float)    Time axis, BJD_TDB (days)
x_phot_ref_               integer         Time offset subtracted from x_phot_. Set to 0 here.
y_phot_                   array(float)    Flux, not median normalized
yerr_phot_                array(float)    Flux error
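As an illustration of how these prefixes combine with the instrument identifiers,
the sketch below extracts and plots each instrument's light curve. This is NOT a
copy of the included TestScript.py (which may differ); it assumes full keys are of
the form <prefix><instrument> (e.g. 'x_phot_TESS_S22'), which is inferred from the
table above, and it requires matplotlib.

# Minimal usage sketch (assumed key convention: <prefix><instrument>)
import numpy as np
import matplotlib.pyplot as plt

PhotDict = np.load("PhotDictTOI5616.npy", allow_pickle=True).item()

# Recover the instrument identifiers from the 'x_phot_' keys
# (excluding the 'x_phot_ref_' offset entries).
instruments = [k[len("x_phot_"):] for k in PhotDict
               if k.startswith("x_phot_") and not k.startswith("x_phot_ref_")]

fig, axes = plt.subplots(len(instruments), 1,
                         figsize=(8, 2.5 * len(instruments)), squeeze=False)
for ax, inst in zip(axes.ravel(), instruments):
    x = PhotDict["x_phot_" + inst] + PhotDict["x_phot_ref_" + inst]  # BJD_TDB (days)
    y = PhotDict["y_phot_" + inst]                                   # flux
    yerr = PhotDict["yerr_phot_" + inst]                             # flux error
    ax.errorbar(x, y, yerr=yerr, fmt=".", ms=2, alpha=0.5)
    ax.set_title(inst)
    ax.set_ylabel("Flux")
axes.ravel()[-1].set_xlabel("BJD_TDB (days)")
fig.tight_layout()
plt.show()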
Instruments included:
The ZTF data, given their cadence, were not used in the joint fitting (Section 4.2)
and are not included in these dictionaries.

PhotDictTOI5176.npy (K2-419)  'TESS', 'K2_C5', 'K2_C18', 'RBO_20220405', 'SWOPE_20230115', 'Keeble_20240206'
PhotDictTOI5218.npy           'TESS_S14', 'TESS_S15', 'TESS_S16', 'TESS_S17', 'TESS_S18', 'TESS_S19', 'TESS_S20',
                              'TESS_S21', 'TESS_S22', 'TESS_S23', 'TESS_S24', 'TESS_S25', 'TESS_S26', 'TESS_S40',
                              'TESS_S41', 'ARCTIC_SDSSi_20220513', 'ARCTIC_SDSSi_20220712'
PhotDictTOI5414.npy           'TESS_S20', 'TESS_S44', 'TESS_S45', 'TESS_S47', 'LCO_20221112'
PhotDictTOI5616.npy           'TESS_S22', 'TESS_S48', 'TESS_S49', 'KeplerCam_20220531'
PhotDictTOI5634.npy           'TESS_S22', 'TESS_S49', 'KeplerCam_20230105', 'LCRO_20230320'
PhotDictTOI6034.npy           'TESS_S16', 'TESS_S17', 'TESS_S18', 'TESS_S25', 'TESS_S56', 'TESS_S57', 'RBO_20230620',
                              'RBO_20230808', 'RBO_20230908', 'Keeble_20231019'
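For reference, the short sketch below (not part of the archive) prints the
instrument data sets found in each dictionary under the same assumed
<prefix><instrument> key convention noted above; its output should reproduce the
listing above.

import numpy as np

files = ["PhotDictTOI5176.npy", "PhotDictTOI5218.npy", "PhotDictTOI5414.npy",
         "PhotDictTOI5616.npy", "PhotDictTOI5634.npy", "PhotDictTOI6034.npy"]

for fname in files:
    d = np.load(fname, allow_pickle=True).item()
    # Instrument identifiers are the suffixes of the 'x_phot_' keys
    # (excluding the 'x_phot_ref_' offset entries).
    instruments = sorted(k[len("x_phot_"):] for k in d
                         if k.startswith("x_phot_") and not k.startswith("x_phot_ref_"))
    print(fname + ":", ", ".join(instruments))

================================================================================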