
WindowManager Array Transition (part 3)#11471

Open
amirroth wants to merge 7 commits into develop from WindowManager4

Conversation

@amirroth
Collaborator

Transitions most remaining uses of ObjexxFCL ArrayX to std::array. The remaining uses are the input processor and something having to do with blind slat profiles, where the current implementation looks like it has an off-by-one bug, but I don't have the time or patience to investigate it.
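A minimal sketch of the kind of transition this PR performs, with illustrative names and values (`maxGapLayers = 5` and both function names are assumptions, not taken from the diff): ObjexxFCL `Array1D` is 1-based and indexed with `()`, while `std::array` is 0-based and indexed with `[]`.

```cpp
#include <array>
#include <cassert>
#include <cstddef>

using Real64 = double;

constexpr std::size_t maxGapLayers = 5; // illustrative value, not from the PR

// ObjexxFCL-style 1-based access, emulated here with a raw buffer that
// holds the values for indices 1..n.
Real64 sumGaps_old_style(Real64 const *hgap, int n)
{
    Real64 sum = 0.0;
    for (int i = 1; i <= n; ++i) sum += hgap[i - 1]; // Array1D: hgap(i)
    return sum;
}

// The std::array replacement uses 0-based indexing throughout.
Real64 sumGaps_new_style(std::array<Real64, maxGapLayers> const &hgap, int n)
{
    Real64 sum = 0.0;
    for (int i = 0; i < n; ++i) sum += hgap[i]; // std::array: hgap[i]
    return sum;
}
```

The mechanical part is easy; the error-prone part (as the review comments below note) is every place where index arithmetic assumed 1-based positions.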

@amirroth added the labels Refactoring (includes code changes that don't change the functionality of the program, just perform refactoring) and NotIDDChange (code does not impact IDD; can be merged after IO freeze) on Mar 15, 2026
@amirroth amirroth added this to the EnergyPlus 26.1 milestone Mar 15, 2026
Collaborator Author

@amirroth amirroth left a comment


Quick walk-through. Not much to see. Mostly tedium.

std::array<Real64, maxGapLayers> &hgap, // Gap gas conductive conductance (W/m2-K)
Array2D<Real64> &Aface, // Coefficient in equation Aface*thetas = Bface
Array1D<Real64> &Bface // Coefficient in equation Aface*thetas = Bface
std::array<std::array<Real64, 2 * maxGlassLayers>, 2 * maxGlassLayers> &Aface, // Coefficient in equation Aface*thetas = Bface
Collaborator Author


The big kahunas.


void LUdecomposition(EnergyPlusData &state,
                     // before:
                     Array2<Real64> &ajac, // As input: matrix to be decomposed;
                     // after:
                     std::array<std::array<Real64, 2 * maxGlassLayers>, 2 * maxGlassLayers> &ajac, // As input: matrix to be decomposed;
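The nested `std::array<std::array<...>>` spelling recurs in several signatures here. One hypothetical way to tame it (not part of the PR; `SquareMatrix` and `GlassFaceMatrix` are invented names) is a fixed-size matrix alias:

```cpp
#include <array>
#include <cassert>
#include <cstddef>

using Real64 = double;
constexpr int maxGlassLayers = 5; // value suggested by the discussion below

// Hypothetical alias, not part of the PR: a fixed-size square matrix type.
template <typename T, std::size_t N>
using SquareMatrix = std::array<std::array<T, N>, N>;

using GlassFaceMatrix = SquareMatrix<Real64, 2 * maxGlassLayers>;

// The signature above could then read:
// void LUdecomposition(EnergyPlusData &state, GlassFaceMatrix &ajac, ...);
```

Since it is only an alias, this changes nothing about layout or codegen; it is purely a readability trade-off.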
Collaborator Author


Have to be very careful with index arithmetic here. I think I got this right.

if (i < wm->nglface) {
rguess(i + 1) = resgap;
for (int i = 1; i < wm->nglface; i += 2) {
rguess[i] = 1.0 / wm->scon[i / 2];
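One way to gain confidence in this index arithmetic is to compare the two loop shapes directly on dummy data. This is a sketch, not the real routine: `fill_1based` is my guess at the original 1-based form, and the conductivity values are made up.

```cpp
#include <vector>
#include <cassert>

using Real64 = double;

// Guessed reconstruction of the old 1-based loop (Array1D-style):
// even faces 2, 4, ... get 1/scon of the owning glass layer.
std::vector<Real64> fill_1based(std::vector<Real64> const &scon, int nglface)
{
    std::vector<Real64> rguess(nglface + 1, 0.0); // slot 0 unused (1-based)
    for (int i = 2; i <= nglface; i += 2)
        rguess[i] = 1.0 / scon[i / 2]; // scon is also 1-based here
    return rguess;
}

// The PR's 0-based form: odd indices 1, 3, ... with scon[i / 2].
std::vector<Real64> fill_0based(std::vector<Real64> const &scon, int nglface)
{
    std::vector<Real64> rguess(nglface, 0.0);
    for (int i = 1; i < nglface; i += 2)
        rguess[i] = 1.0 / scon[i / 2];
    return rguess;
}
```

Shifting every filled slot down by one, the two forms agree: 1-based face 2 maps to 0-based index 1, and `scon(i/2)` at i = 2, 4, 6 hits the same layers as `scon[i/2]` at i = 1, 3, 5.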
Collaborator Author


Another index arithmetic gotcha.

--L;
}
} // W5LsqFit2()

Collaborator Author


We have not been using this version for quite a while.

bld_pr[3] = 0.0;
bld_pr[4] = 0.0;
bld_pr[5] = 0.0;
bld_pr[6] = matBlind->slatTAR.Sol.Ft.Bm[0].DfTra;
Collaborator Author


At some point, we can just pass a const reference to this data structure instead of packing everything into an array and unpacking it INTO THE SAME STRUCTURE in the called function.
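A sketch of the refactor this comment proposes, with hypothetical stand-in types modeled loosely on the snippet above (`SlatTAR`, the field names, and the array width are all invented for illustration):

```cpp
#include <array>
#include <cassert>

using Real64 = double;

// Hypothetical stand-in for the blind-properties structure.
struct SlatTAR {
    Real64 dfTra = 0.0; // diffuse transmittance (illustrative field)
    Real64 dfRef = 0.0; // diffuse reflectance (illustrative field)
};

// Current pattern (per the comment): pack the fields into an array...
std::array<Real64, 2> pack(SlatTAR const &tar)
{
    return {tar.dfTra, tar.dfRef};
}

// ...and unpack them back into the same structure inside the callee.
void unpack(std::array<Real64, 2> const &bld_pr, SlatTAR &tar)
{
    tar.dfTra = bld_pr[0];
    tar.dfRef = bld_pr[1];
}

// Proposed pattern: the callee just takes a const reference and the
// pack/unpack round trip disappears.
Real64 use_directly(SlatTAR const &tar)
{
    return tar.dfTra + tar.dfRef;
}
```

Beyond saving copies, the const-reference version removes a class of bugs where the pack and unpack orderings silently drift apart.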

Collaborator


🤯

// old (1-based):
for (int k = 1; k <= i - 1; ++k) {
    sum -= ajac(k, i) * ajac(j, k);
// new (0-based):
for (int j = 0; j < n; ++j) {
    for (int i = 0; i < j; ++i) { // should this be j-1?
Collaborator


Risky to say anything here but i < j does look correct. I just substituted values for i, j, and k and it seems the loops do the same thing now.

Collaborator


I agree. I think i < j is correct.
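The substitution check the reviewers describe can be written down as a small test on dummy bounds: collect the (outer, inner) pairs each loop shape visits and compare them. This is an illustration, not the PR's code.

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Pairs visited by a 1-based "for j = 1..n, for i = 1..j-1" nest.
std::vector<std::pair<int, int>> pairs_1based(int n)
{
    std::vector<std::pair<int, int>> out;
    for (int j = 1; j <= n; ++j)
        for (int i = 1; i <= j - 1; ++i)
            out.emplace_back(j, i);
    return out;
}

// Pairs visited by the 0-based "for j = 0..n-1, for i = 0..j" nest with
// i < j, shifted back to 1-based for comparison.
std::vector<std::pair<int, int>> pairs_0based(int n)
{
    std::vector<std::pair<int, int>> out;
    for (int j = 0; j < n; ++j)
        for (int i = 0; i < j; ++i)
            out.emplace_back(j + 1, i + 1);
    return out;
}
```

Both nests visit exactly the pairs with inner strictly below outer, which supports the conclusion that `i < j` (not `i < j - 1`) is the right translation of `k <= i - 1`.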

std::array<std::array<Real64, 6>, 6> F; // View factor array
std::array<std::array<Real64, 2 * maxGlassLayers>, 2 * maxGlassLayers> X; // Exchange matrix
std::array<std::array<Real64, 2 * maxGlassLayers>, 2 * maxGlassLayers> Xinv; // Inverse of exchange matrix
std::array<int, 2 * maxGlassLayers> indx; // LU decomposition indices
Collaborator


I've seen these 2 * maxGlassLayers a lot. If int constexpr maxGlassLayers = 5; then would it be beneficial to create int constexpr twoTimesMaxGlassLayers = 10; ?

Collaborator


Seems reasonable to me.

Collaborator Author


FWIW, the compiler reduces any expression that consists only of constants (otherwise, this wouldn't compile, because the second argument to the std::array template has to be a constant expression). Replacing it with twoTimesMaxGlassLayers doesn't save any runtime, makes the code slightly longer, and, in my opinion, slightly less clear. That said, it's six of one, half a dozen of the other.

Collaborator


Thanks for that info. As you can see, I already pushed the change, but it doesn't sound like that's a problem.


// before:
indx = 0;
// after:
indx = {0};
InvertMatrix(state, X, Xinv, indx, 4); // Autodesk:Note X modified by this call
Collaborator


Out of scope but indx is getting initialized twice, once here and once in the function (line 8241).
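If the redundant zeroing is ever cleaned up, `std::array` makes one-time initialization trivial: value-initializing at the declaration zeroes every element, and `= {0}` does the same (first element from the initializer, the rest value-initialized). A sketch, assuming `indx` is the `std::array<int, 2 * maxGlassLayers>` declared earlier:

```cpp
#include <array>
#include <cassert>

constexpr int maxGlassLayers = 5; // illustrative value from the discussion

// Zero once, at the declaration: value-initialization sets all elements to 0,
// so neither `indx = 0;` nor `indx = {0};` is needed afterwards.
std::array<int, 2 * maxGlassLayers> indx{};
```

Either zeroing site could then be dropped, leaving a single unambiguous initialization.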

@github-actions

⚠️ Regressions detected on ubuntu-24.04 for commit 76782e0

Regression Summary
  • MTD: 28
  • RDD: 21
  • Audit: 18
  • MTR Big Diffs: 2
  • Table Big Diffs: 2
@mitchute
Collaborator

I'll rerun regressions on this soon. The current results don't make sense to me.

@mitchute
Collaborator

Regressions look good now. @jmarrec I don't think I did things out of order on this, so I'm a bit confused about the false positive on the regressions here. Flagging for your awareness.
