I have a project with a moderate to large intravenous infusion volume (up to 500 mL). The infusion happens over 30 minutes to 2 hours depending on the study and the subject.
In a combined PK/PD model that includes an endogenous protein in blood, I see an offset that resolves over the course of about one day. It does not appear to affect the amount of the endogenous protein, just its concentration. My hypothesis is that this is hemodilution caused by the infusion volume and its subsequent urinary elimination. Currently, I handle this as a multiplicative modification to the concentration of the endogenous protein, with first-order elimination relative to time after dose (doses are spaced far enough apart that hemodilution from one dose never overlaps the next).
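For concreteness, here is a minimal sketch of the kind of multiplicative correction I mean. The parameterization is an assumption for illustration: `f0` is the hypothetical fractional dilution immediately after the infusion (e.g. roughly 500 mL into ~5 L of blood), and `k_dil` is a first-order rate at which the dilution resolves as the infused volume is eliminated renally.

```python
import numpy as np

def dilution_multiplier(t_after_dose, f0, k_dil):
    """Multiplicative factor applied to the true concentration.

    f0    -- fractional dilution at end of infusion (assumed, e.g. 0.1)
    k_dil -- first-order resolution rate (1/h), so the offset decays
             toward 1 as the infused fluid is eliminated in urine
    """
    return 1.0 - f0 * np.exp(-k_dil * t_after_dose)

def observed_concentration(c_true, t_after_dose, f0, k_dil):
    # Hemodilution lowers the measured concentration without changing
    # the amount of endogenous protein in blood.
    return c_true * dilution_multiplier(t_after_dose, f0, k_dil)
```

With `f0 = 0.1` and `k_dil` around 0.2/h, the multiplier starts at 0.9 and is essentially 1 again by 24-48 hours, matching the one-day resolution I observe; the actual values would of course be estimated from the data.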
Has anyone else run into hemodilution or something similar in a model? If so, did you use anything other than first-order elimination of the dilution effect? (Or, with sparser sampling, I imagine one might simply estimate an offset factor for the first post-dose measurement.)
Thanks in advance,