Comments (11)
Ryan Knox has a version of the code with the XML dependencies changed, which may solve your problem and which he might be able to share. I use this code as well, so if he doesn't have a version of the model on hand to help you with, I might be able to provide one.
Also, just in case you didn't already know, there are already at least three bodies of code that you might find useful for the data assimilation:
- The Dietze Lab's PEcAn software
- Ke Zhang's Fortran DRAM version of ED
- My SA/DRAM Matlab program, which calls ED iteratively, changing parameters via the XML from ED2
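As a rough sketch of that last approach (a wrapper that rewrites the XML, re-runs the model, and accepts or rejects steps via simulated annealing), here is a toy Python version. The `Vm0` parameter name, the XML layout, and the quadratic stand-in for an actual ED2 run are all illustrative assumptions, not the real interface:

```python
import math
import random
import xml.etree.ElementTree as ET

random.seed(1)

def write_config(path, params):
    # Write a single <pft> block of overrides. Element names are illustrative,
    # not necessarily the actual ED2 XML schema.
    root = ET.Element("config")
    pft = ET.SubElement(root, "pft")
    for name, value in params.items():
        ET.SubElement(pft, name).text = str(value)
    ET.ElementTree(root).write(path)

def run_model(params):
    # Stand-in for launching ED2 and scoring its output against observations;
    # here just a toy quadratic cost with its optimum at Vm0 = 15.
    return (params["Vm0"] - 15.0) ** 2

def simulated_annealing(n_iter=200, temp0=10.0):
    params = {"Vm0": 25.0}
    cost = run_model(params)
    best_params, best_cost = dict(params), cost
    for i in range(n_iter):
        temp = temp0 * (1.0 - i / n_iter) + 1e-9   # linear cooling schedule
        trial = {"Vm0": params["Vm0"] + random.gauss(0.0, 1.0)}
        write_config("config.xml", trial)          # what a real wrapper hands to ED2
        c = run_model(trial)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if c < cost or random.random() < math.exp((cost - c) / temp):
            params, cost = trial, c
            if cost < best_cost:
                best_params, best_cost = dict(params), cost
    return best_params, best_cost

params, cost = simulated_annealing()
```

A real wrapper would replace `run_model` with a subprocess call to the ED2 executable and a comparison of its output files against data; everything else about the loop stays the same.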
Unless @rgknox has updated it, the version he proposed put a lot of the calculations after the XML read, and thus would probably exacerbate this problem.
@DanielNScott, Toni's part of the PEcAn team. He's working with @serbinsh to do some uncertainty analyses of the optical traits leading up to an effort to assimilate hyperspectral imagery. Also, I don't know the details of what's going on outside of PEcAn's efforts, but I think Toni's the only one who's doing state data assimilation rather than parameter calibration.
What are the "DRAM" versions of ED? Where do we find more about what you and Ke have done? These aren't in the mainline, so are they in other private repositories/forks? Are they documented somewhere? Are all three efforts just reinventing the same wheel?
Hi Toni et al.,
As others have indicated, I looked into something slightly different previously, but not really what I think you are proposing. Good luck with your improvements, let us know if we can help or test.
Whew, there's a lot to answer here...
@mdietze As far as I know you're right about Toni being the only one doing state data assimilation, though I don't know who everyone is and I don't know what everyone's working on either. I did think that was something you folks with PEcAn were doing, hence your place at the top of the list of having potentially relevant tools. Maybe if other folks are doing so or intend to they can chime in?
Re. other versions of ED...
@ke-zhang has a version of the model (which I believe has some connection to David Medvigy, but you'll have to check with Ke) with a parameter estimation scheme built into it, in Fortran. I believe the particular iterative algorithm used is a delayed rejection adaptive Metropolis (DRAM) algorithm. I inherited this model through some confluence of reasons not entirely clear to me, but involving Ke's usage and some conception that PEcAn was not an appropriate alternative at the time. I don't know more about that decision, since I didn't make it. In any case, I ultimately found the code I got from Ke overly cumbersome, both to keep up to date with the mainline and in other ways, so I reverted to using my version of the mainline (i.e., the mainline plus various additions for tracking C-13) and created a sort of Matlab wrapper to do the iterating and fitting. That "wrapper" also does simulated annealing at this point.
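For readers unfamiliar with DRAM, the idea is roughly: run an adaptive Metropolis chain whose proposal scale tracks the sample variance of the chain, and when a proposal is rejected, attempt a second, narrower proposal with a corrected acceptance probability. A deliberately simplified 1-D sketch (symmetric proposals, proposal-density ratio omitted, and a toy standard-normal target standing in for the ED likelihood):

```python
import math
import random

random.seed(0)

def log_post(x):
    # Toy log-posterior: standard normal. In practice this would be the
    # log-likelihood of ED2 output against observations.
    return -0.5 * x * x

def dram(n_iter=5000, x0=0.0):
    # Simplified 1-D delayed-rejection adaptive Metropolis sampler.
    x, lp = x0, log_post(x0)
    chain = [x]
    mean, m2 = x, 0.0                      # Welford running mean/variance
    for i in range(1, n_iter + 1):
        s = 2.4 * math.sqrt(m2 / i) + 0.1  # adapted proposal std, with a floor
        y1 = x + random.gauss(0.0, s)
        lp1 = log_post(y1)
        a1 = math.exp(min(0.0, lp1 - lp))
        if random.random() < a1:
            x, lp = y1, lp1                # stage-1 accept
        else:
            # Delayed rejection: a second, narrower proposal. The acceptance
            # probability below assumes symmetric proposals and omits the
            # proposal-density ratio of full DRAM for brevity.
            y2 = x + random.gauss(0.0, 0.5 * s)
            lp2 = log_post(y2)
            a12 = math.exp(min(0.0, lp1 - lp2))
            a2 = math.exp(lp2 - lp) * (1.0 - a12) / (1.0 - a1)
            if random.random() < a2:
                x, lp = y2, lp2            # stage-2 accept
        chain.append(x)
        d = x - mean                       # update adaptation statistics
        mean += d / (i + 1)
        m2 += d * (x - mean)
    return chain

chain = dram()
```

The narrower second-stage proposal is what lets DRAM recover acceptance in regions where the adapted scale is temporarily too large.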
As far as documentation and code, that's a good question. I assume Ke could say if/how/where the code he uses is documented, and where said code is. As far as the Matlab code I'm using, you can find it on my Github page, though both the code and the documentation are a little out of date at the moment. I would happily update either/both!
Lastly, re. reinventing the wheel, somewhere along the line something like that may have happened. I don't really know; I don't really know when or with whom Ke's code originated, or for that matter when or with whom PEcAn originated. But given that PEcAn is much more than just a simple search algorithm or sampling method on top of ED, it doesn't really seem like using less sophisticated tools, those you're accustomed to or given, or some straightforward derivative thereof constitutes reinventing the wheel. So I don't believe anyone is presently, or has been recently, doing so or attempting to do so.
Hi @rgknox I just wanted to follow up briefly on this thread. A while back you and I communicated about a branch of the ED2 model where you had modified ed_params.f90 to allow the XML file to update the radiation parameters (e.g., leaf/branch/soil reflectance/transmittance, and therefore the canopy/soil albedo) to examine the impacts of parameterization/RTM on ED2 projections (e.g., see attached slide). I still have that version of the code, but I had thought that those changes were going to make it into the mainline. Before @Viskari and I work on moving some items/calculations out of ed_params, I wanted to be sure that this has indeed not yet been implemented and that we haven't missed it.
In general, this question isn't specific to DA; rather, it delves into the RTM properties of ED, as we need to make some modifications there before we can begin to look at assimilating or using observations to inform RTM parameters/choices.
Thanks!
@rgknox Oh, and one more thing. In the example I provided we stuck with the broadband "binning" of the VIS/NIR data, but we will be making modifications to allow spectral data to inform the canopy albedo. That is why you see, for example, "leaf_trans_nir" and "leaf_trans_vis".
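For reference, an override of the sort described might look like the fragment below. The `leaf_trans_vis`/`leaf_trans_nir` names come from the comment above; the remaining element names and all values are illustrative guesses, not necessarily the exact ED2 config schema:

```xml
<config>
  <pft>
    <num>3</num>
    <leaf_reflect_vis>0.11</leaf_reflect_vis>
    <leaf_reflect_nir>0.46</leaf_reflect_nir>
    <leaf_trans_vis>0.06</leaf_trans_vis>
    <leaf_trans_nir>0.33</leaf_trans_nir>
  </pft>
</config>
```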
Hi Shawn,
I do not plan to implement any of my previous modifications to ed_params.f90, because we ran into conflicts with how updated parameters may have been intended to be controlled through the XML (and therefore overwritten). I would be happy to have a phone call with you guys if you want to resurrect any of that stuff.
Hi Ryan.
Yes, I think a small discussion on this is in order. Currently I think the relevant group includes @Viskari @mdietze @blowfish711 and possibly @mpaiao, as well as yourself. Any others?
Basically, before we get too far we need to sort out what changes actually made it into the GitHub version, who might be making specific modifications (so we don't duplicate), and define who will head up the modifications to the XML and ed_params.f90. I will create a Doodle to try and sort this out.
Before the meeting, I will look back at some older ED2 versions I was using, diff the versions, and check what's different and what we may need to (re)implement.
@serbinsh The general clean-up of the XML was discussed in the most recent ED2 telecon (last week), and we thought that this specific issue was just a subsection of the more general XML clean-up agreed to at the AGU meeting. Please let us know if that's not the case. Specifically, we agreed to make sure that all ED2 parameters were in the XML so that we could eventually ditch ed_params completely. Not sure this needs a whole telecon, but it would be good for you, as the RS and RTM expert, to clarify what makes the most sense to define as the specified parameters. I lean toward not doing ANY calculation after the XML, but if there are derived quantities involving physical constraints that have to be met between quantities, then those should stay in there. Where I'd argued with @rgknox previously was about the fact that many of the "calculations" in ed_params are not physical constraints but empirical trends, and thus users need to be able to input parameter combinations that deviate from those trends (e.g., as new data become available).
Also discussed in the last telecon was that the clean-up is being coordinated by @rgknox and me, with help from @mpaiao. We would love help from you and Toni.
@mdietze I agree that almost all parameters should go to the XML. There are only a few parameters that have physical constraints (e.g., the total scattering coefficient and the emissivity in the TIR must always add up to 1). But in that case we should select which variable will be set through the XML and which one must be internally calculated, and make sure the latter cannot be set through the XML.
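One way to enforce that kind of constraint at read time is to let the XML set exactly one member of the constrained pair and derive the other. A minimal sketch (the function and argument names are mine, not ED2's):

```python
def set_tir_optics(scattering=None, emissivity=None):
    """Enforce the TIR constraint scattering + emissivity == 1.

    Exactly one of the two may be specified (e.g. from the XML read);
    the other is always derived, so the pair can never be inconsistent.
    """
    if (scattering is None) == (emissivity is None):
        raise ValueError("specify exactly one of scattering or emissivity")
    if scattering is None:
        scattering = 1.0 - emissivity   # derive the unset member
    else:
        emissivity = 1.0 - scattering
    if not (0.0 <= scattering <= 1.0):
        raise ValueError("values must lie in [0, 1]")
    return scattering, emissivity
```

Blocking the derived member from appearing in the XML at all (as suggested above) avoids silently overwriting a user-supplied value.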