The problem was that I had an overeager HttpInterceptor that was converting all HTTP error responses into a string.
The interceptor was simply returning:
err.statusText
which is why I only saw a string, not an HttpErrorResponse object.
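For reference, a minimal sketch of an interceptor that avoids this (not the original code, just an assumed setup): rethrow the full HttpErrorResponse instead of mapping it to err.statusText, so subscribers still receive the whole error object.
import { Injectable } from '@angular/core';
import { HttpInterceptor, HttpRequest, HttpHandler, HttpEvent, HttpErrorResponse } from '@angular/common/http';
import { Observable, throwError } from 'rxjs';
import { catchError } from 'rxjs/operators';

@Injectable()
export class ErrorInterceptor implements HttpInterceptor {
  intercept(req: HttpRequest<unknown>, next: HttpHandler): Observable<HttpEvent<unknown>> {
    return next.handle(req).pipe(
      // Keep the HttpErrorResponse intact instead of returning err.statusText
      catchError((err: HttpErrorResponse) => throwError(() => err))
    );
  }
}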
The Correlation Error for our team was caused by a short RemoteAuthenticationTimeout setting. It was intermittent, due to the timeout being reached in some instances and not others. It's unfortunate the same error can be linked to so many different circumstances.
In our case, I think the setting was mistakenly assumed to act as a request timeout, when it actually governs the time limit for completing the entire authentication flow.
I'd guess that your issuer isn't specified in your OpenID configuration. You'd have to decode your JWT to verify that. If that's the problem, you just need to add all of your issuers to that validate-jwt policy.
Since you're doing everything in Entra ID, you might consider using a validate-azure-ad-token policy instead of a validate-jwt policy, since that already checks for all of the issuers that might issue your tokens.
I have been getting this error for the last two or three days.
You can change the owner of the database to the new user:
ALTER DATABASE airflow OWNER TO airflow;
I'm guessing that inside the foreach loop you are persisting data outside the code you've shown.
Can you update the question to include the logic behind the comment:
//Do stuff with $row
If you accumulate data outside of that statement, it stays in memory. So while you have an optimization with regard to chunking the
DB::SELECT(...)
the memory leak remains, since data is likely aggregated outside of the relevant code you provided.
Using Redis could potentially help alleviate some of the pressure on your Aerospike database, especially if your application has high read demands or if some data doesn't need to be fetched from Aerospike in real time for every request. For data that doesn't change frequently (like user profiles, configuration settings, or lookup data), you could store it in Redis as a cache layer in front of Aerospike.
By doing so, you can reduce read load on Aerospike and improve response times for frequently requested data.
Redis is also good for distributed locking or rate limiting, specifically in a concurrent environment. Redis's distributed locking mechanism (using something like Redlock) can help ensure only one goroutine or instance accesses Aerospike at a time for specific actions.
Just remove "chakra-ui-color-mode" from localStorage and refresh the page. That should fix it.
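If you'd rather do it programmatically (for example from the browser console or a one-off reset button) instead of through DevTools, a small sketch:
// Clear Chakra UI's persisted color mode and reload so the default applies
localStorage.removeItem('chakra-ui-color-mode');
window.location.reload();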
The solution for me was to correct "homepage": "/" in my package.json. I had been fiddling with it earlier for a GitHub Pages preview.
So check that as well.
If the accepted answer doesn't work because of some default styling, set the !important flag in the global file.
.hide-dropdown {
display: none !important;
}
This fixed my issue.
Thanks to @nc1943 above for their excellent answer. I wanted to set system-level environment variables and also hit a couple of minor errors when trying to use the code above. I've tweaked it, and if run in an Admin PowerShell session it successfully imports the variables:
param(
[string]$Path,
[switch]$Verbose,
[switch]$Remove,
[switch]$RemoveQuotes
)
$variables = Get-Content -Path $Path | Select-String -Pattern '^\s*[^\s=#]+=[^\s]+$'
foreach ($var in $variables) {
$keyVal = $var -split '=', 2
$key = $keyVal[0].Trim()
if ($RemoveQuotes) {
$val = $keyVal[1].Trim("'").Trim('"')
} else {
$val = $keyVal[1]
}
if ($Remove) {
[Environment]::SetEnvironmentVariable($key, '', [System.EnvironmentVariableTarget]::Machine)
} else {
[Environment]::SetEnvironmentVariable($key, $val, [System.EnvironmentVariableTarget]::Machine)
}
if ($Verbose) {
"$key=$([Environment]::GetEnvironmentVariable($key, [System.EnvironmentVariableTarget]::Machine))"
}
}
Those are Windows file paths, but you're running the program in WSL. Your paths need to be relative to the WSL filesystem root (/). Also, UNIX paths use forward slashes, not backslashes.
The correct path would be ~/lab1/obrada.txt
where the tilde stands for the /home/USER
directory.
You should set the message attribute of the annotation.
@NotNull(message = "{org.company.User.name.NotNull}")
private String name;
select distinct city from station where city REGEXP '[aeiou]$';
I feel your pain. IMHO breaking backwards compatibility requires a very good reason, and I'm not sure there is one in this case. See this GitHub issue.
It should be images, not image:
https://nextjs.org/docs/app/api-reference/functions/generate-metadata#metadatabase
openGraph: {
images: '/og-image.png',
},
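For context, a minimal sketch of how metadataBase ties in (the domain below is a placeholder, and this assumes the App Router metadata export): with metadataBase set, a relative path like '/og-image.png' is resolved to an absolute URL in the generated Open Graph tags.
import type { Metadata } from 'next';

export const metadata: Metadata = {
  // Hypothetical site URL; replace with your own domain
  metadataBase: new URL('https://example.com'),
  openGraph: {
    images: '/og-image.png',
  },
};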
Since you mentioned in the comment section that you are using Cloud Run Functions as the service, I recommend reviewing this documentation, which will guide you on how to deploy a Cloud Run function from source code and help you understand the build process during deployment.
As John also suggested, you can view runtime logs for Cloud Run Functions in the Logs Explorer so you can see the actual error. You can also check the Cloud Run Functions troubleshooting errors page, which might help you.
I hope this helps.
I had Xcode 15.4. I tried to run the app on an old simulator (iOS 15) that I had downloaded before.
The simulators for iOS 15 simply were not visible in the simulators list.
The trick was to go to Window > Devices and Simulators, find the needed device you want to use and switch the option from "Automatic" to "Always".
OK, so I could not find a way around the mentioned limitations with WPF. What I did instead was add a brief timer to my HOLD button that keeps the GO button enabled. The user touches HOLD, then GO. All good. Not exactly what I wanted, but close enough to work for the user. Interestingly enough, I've now seen multiple other Windows tablet UIs that use the same approach, so I'm guessing I'm not the only one to run into this issue.
I am also confronted with this issue. My code works fine in the desktop Signal client, but when running it as a service, the service will not connect due to an SSL error.
The problem ended up being that our Content-Type
header was:
Content-Type: */*; charset=utf-8
Instead of:
Content-Type: text/html; charset=utf-8
Twitter/X, LinkedIn, and Apple Messages choked on */*.
I don't know. heh 😁
<MARQUEE behavior="scroll" direction="left" width="100%">
<img src="yourimagehere1.gif">
<img src="yourimagehere2.gif">
<img src="yourimagehere3.gif">
</MARQUEE>
I agree. I've noticed that the articles like this one that use pca.components_ * np.sqrt(pca.explained_variance_)
seem to cite the theory behind what PCA loadings are to explain their reasoning. On the other hand, the ones that use abs
or normalize the abs
of the pca.components_
seem to focus on feature importance like this.
I think this blog best explains it in its section titled "Difference Between Loadings, Correlation Coefficients and Eigenvectors."
I hope scikit-learn fixes this in their guides and adds a method to compute it soon.
In the SQL Editor window, click on the database name at the top (or Ctrl+0, that's Ctrl+zero) to open the "Choose catalog/schema" dialog. Select the database (catalog) in the left pane by clicking on it, then select the schema in the right pane by double-clicking on it, which also closes the dialog. Click on the following images to see screenshots.
You can change the port location by going inside the subsystem, double-clicking the Connection Port, and changing the "Port location on parent subsystem" attribute.
Hi, can you provide me the code for ClearKey DRM in ExoPlayer? I am working on the same project but I have been facing a source issue. Even though I have tested it with the link you provided and it's working just fine, in the app it's giving me a source error. It would be very helpful if you could share your code so I can look at it.
Assuming the text is in cell A1, you could try it by combining TEXT with REGEXEXTRACT and DATEVALUE.
=TEXT(DATEVALUE(REGEXEXTRACT(A1, "([A-Za-z]{3}) (\d{1,2}), (\d{4})")), "MM/DD/YYYY")
Had the same problem; solved it with ng-init and active:
<tabset>
<tab
ng-repeat="tab in tabs"
ng-init="tab.active = $index === 0"
active="tab.active"
heading="{{ tab.heading }}"
>
</tab>
<tab heading="Foo">
</tab>
</tabset>
Even WordPress's own interface is broken. When you update a user and click the link at the top to return to the search results, you're taken to a URL with the @ symbol removed.
I opened a bug report a long time ago, but they seem to care more about making it difficult for us to ask for help than about actually replicating the problem and fixing it when made aware of issues.
WordPress did NOT have this problem before. It started with an update released shortly before my original topic there.
Can anybody help and maybe join the conversation over there, so that hopefully one of the developers maintaining WordPress actually looks into it? Maybe they think I am a lone lunatic with the issue and won't bother helping if they don't see others asking for help too.
I did it this way:
factory<ActionHandler> { (firstParamArgs: FirstParamArgs, secondParam: SecondParam) ->
MySecondViewModel(firstParamArgs, secondParam)
}
private val handler: ActionHandler by inject { parametersOf(inputParamArgs.firstParamArgs, inputParamArgs.secondParam) }
There isn't any other way to test the Google* Assistant integration end-to-end. You can install another supported version of Studio without downgrading the main version you use. See How to install multiple Android studio with different versions on same PC?
*You could write your own test Assistant app to test it, but that would be quite a lot of work.
The fact that your code is fine for user=portal but not for user=public indicates that the problem is due to access/security rules on the model 'supplier.registration.activity'.
...
Could you check the RPC response using console.log(records)?
Use this URL to get cluster image:
https://raw.githubusercontent.com/googlearchive/js-marker-clusterer/refs/heads/gh-pages/images/m1.png
The most reliable workaround I have found is using the CMAKE_EXPORT_COMPILE_COMMANDS option. With this (but only when the CppTools extension is installed), one can type IntelliSense in the Command Palette, which offers the option to choose the data source, and then select the relevant compile_commands.json file. Once this is set, navigation through Ctrl+Click seems to work correctly within the sources which belong to the selected build target.
A hero posted this answer on GitHub, and after 5 hours of debugging and trying every single solution posted on GitHub and Stack Overflow, this one finally solved the problem:
Per https://source.android.com/docs/core/architecture/hidl/binder-ipc#vndbinder
Normally, vendor processes don't open the binder driver directly and instead link against the
libbinder
userspace library, which opens the binder driver.
After clearing the files in the DerivedData folder, can you build again and check?
The path is something like this:
/Users/YOUR-USER-NAME/Library/Developer/Xcode/DerivedData/
Check the sizes prop at [NextJS Image]
[NextJS Image]: https://nextjs.org/docs/pages/api-reference/components/image
An example:
sizes="(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 33vw"
Now it renders at good quality on larger devices and serves an optimised version on smaller screens. Read the documentation for details!
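As a rough sketch of how the prop is used (assuming Next.js 13+ and the fill layout; the asset path is a placeholder):
import Image from 'next/image';

export default function Hero() {
  return (
    <div style={{ position: 'relative', width: '100%', height: 400 }}>
      <Image
        src="/hero.png" // hypothetical asset
        alt="Hero banner"
        fill
        // Tells the optimizer which rendered width to expect per breakpoint
        sizes="(max-width: 768px) 100vw, (max-width: 1200px) 50vw, 33vw"
      />
    </div>
  );
}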
We can create a ColorDrawable and convert it to an ImageBitmap.
Image(
modifier = Modifier.matchParentSize(),
bitmap = ColorDrawable(Color.Black.toArgb()).toBitmap(10, 10).asImageBitmap(),
contentScale = ContentScale.FillBounds,
contentDescription = null
)
The above answers are for Elasticsearch.
Follow this document for OpenSearch; there is more you need to do to make it work:
https://docs.aws.amazon.com/opensearch-service/latest/developerguide/managedomains-snapshot-registerdirectory.html
Thank you Dario M. and mason - I cannot upvote your comment due to my lack of reputation points but your comment helped me fix my issue. Thank you!
You can use a dedicated FocusNode for the TextField and then, after the done button is pressed, call unfocus() on that FocusNode.
Microsoft Outlook and Azure logic apps interpret e-mail in a very strict manner.
I ended up using this resource to understand some aspects: https://learn.microsoft.com/en-us/exchange/mail-flow-best-practices/message-format-and-transmission
What I had trouble with was the strict handling of e-mails sent by a black-box piece of software run by another party. They would send an e-mail with text in the body and also an attachment with the same content. The body text was encoded with the same type as the attachment: Content-Type: text/xml
Outlook and an Azure Logic App were strict and interpreted this as an attachment, even though it was where the body of a message would go, and they auto-assigned a filename to the attachment. Thunderbird and Gmail were more relaxed in their handling of the e-mail: even though the body section had a text/xml type, because it was where the body would be, they interpreted it as plain text, displayed the XML as the body of the message, and then handled the actual attachment correctly as an attachment.
My takeaway is that Microsoft products appear to be very strict in how they handle messages.
Now it's possible to compile C++ code into WASM and then use it in Go with the help of the wazero runtime. No CGO is needed. Consider go-sqlite3 as an example.
Here's my fixing up of it: main.js
async function getLatestRelease(repo) {
const response = await fetch(`https://api.github.com/repos/${repo}/releases/latest`);
const data = await response.json();
return data.tag_name;
}
function downloadFile(url, filename) {
const link = document.createElement('a');
link.href = url;
link.download = filename;
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
}
async function showPopup(type) {
let popupText = '';
let downloadLink = '';
if (type === 'dev') {
popupText = 'Thank you for downloading Comp_V3 *dev*! This download link goes to the Github API. This is the source code. <br> <br> You will need my Custom SDK to use this. Check out my other download in the navbar.';
downloadLink = 'https://github.com/RanchoDVT/Comp-V5/archive/refs/heads/dev.zip';
document.getElementById('popup-title').innerText = 'Download Comp-V3 ' + type;
} else if (type === 'stable') {
const latestTag = await getLatestRelease('RanchoDVT/Comp-V5');
popupText = 'Thank you for downloading Comp_V3 stable! This download link goes to the Github API. This is the source code. <br> <br> You will need my Custom SDK to use this. Check out my other download in the navbar.';
downloadLink = `https://github.com/RanchoDVT/Comp-V5/archive/refs/tags/${latestTag}.zip`;
document.getElementById('popup-title').innerText = 'Download Comp-V3 ' + type + ' ' + latestTag;
} else if (type === 'sdk') {
const latestTag = await getLatestRelease('RanchoDVT/Vex-SDK');
popupText = 'Thank you for downloading my custom SDK. This is unofficial and in no way affiliated, endorsed, supported, or created by VEX Robotics. <br> <br> You will need this to install my Custom SDK (This) to use my Comp_V3 Program. This modifies Vex\'s robotics extension, so PLEASE don\'t go to them if you have problems with this. Please contact me. <br> <br>There is a PowerShell script for this to make it easier: ';
popupText += '<a href="https://minhaskamal.github.io/DownGit/#/home?url=https://github.com/RanchoDVT/Vex-SDK/blob/dev/Vex-SDK.updater.ps1">Powershell download</a>';
document.getElementById('popup-title').innerText = 'Download Custom ' + type + ' ' + latestTag;
downloadLink = `https://github.com/RanchoDVT/Vex-SDK/archive/refs/tags/${latestTag}.zip`;
}
document.getElementById('popup-text').innerHTML = popupText; // Use innerHTML to render HTML content
document.getElementById('download-link').href = downloadLink;
document.getElementById('popup').classList.add('active');
document.getElementById('overlay').classList.add('active');
}
function hidePopup() {
document.getElementById('popup').classList.remove('active');
document.getElementById('overlay').classList.remove('active');
}
the navbar:
<nav>
<li><a class="nav-link" data-page="index.html" href="index.html">Home</a></li>
<li class="dropdown">
<a class="nav-link" data-page="projects.html" href="projects.html">Projects</a>
<div class="dropdown-content">
<a target="_blank" href="https://github.com/Voidless7125/Comp-V5">Comp V3</a>
<a target="_blank" href="https://github.com/RanchoDVT/Vex-SDK">Custom SDK</a>
<a target="_blank" href="https://ranchodvt.github.io/Comp-V5/">This website!</a>
</div>
</li>
<li class="dropdown">
<a class="nav-link">Downloads</a>
<div class="dropdown-content">
<a onclick="showPopup('stable')">Comp_V3 Stable</a>
<a onclick="showPopup('dev')">Comp_V3 Dev</a>
<a onclick="showPopup('sdk')">Custom SDK Stable</a>
</div>
</li>
<li><a class="nav-link" data-page="features.html" href="features.html">Features</a></li>
<li><a class="nav-link" data-page="contact.html" href="contact.html">Contact</a></li>
<li style="float: right;"><a class="nav-link" data-page="about.html" href="about.html">About</a></li>
</nav>
<!-- Pop-Up Structure -->
<div id="popup" class="popup">
<div class="popup-header">
<h2 id="popup-title">Download</h2>
</div>
<p id="popup-text"></p>
<button class="cancel-btn" onclick="hidePopup()">Cancel</button>
<a id="download-link" class="download-btn" href="#" download>Download</a>
</div>
<div id="overlay" class="overlay" onclick="hidePopup()"></div>
and the css:
.popup {
display: none;
position: fixed;
left: 50%;
top: 50%;
transform: translate(-50%, -50%);
width: 400px;
border: 1px solid #ccc;
padding: 20px;
background-color: #fff;
box-shadow: 0 2px 10px rgba(0, 0, 0, 0.1);
z-index: 1000;
border-radius: 8px;
}
.popup.active {
display: block;
background-color: black;
}
.popup-header {
display: flex;
justify-content: space-between;
align-items: center;
margin-bottom: 10px;
background-color: black;
}
.popup-header h2 {
margin: 0;
font-size: 18px;
background-color: black;
}
.download-btn, .cancel-btn {
display: inline-block;
margin-top: 10px;
padding: 10px 20px;
border: none;
border-radius: 4px;
cursor: pointer;
}
.download-btn {
background-color: #4CAF50;
color: white;
text-decoration: none;
text-align: center;
}
.cancel-btn {
background-color: #f44336;
color: white;
}
.overlay {
display: none;
position: fixed;
left: 0;
top: 0;
width: 100%;
height: 100%;
background: rgba(0, 0, 0, 0.5);
z-index: 999;
}
.overlay.active {
display: block;
}
You used a dependency that was built with .NET 4.0 while your project is configured for .NET 2.0 through 3.5.
For example: the dependency targets .NET 4.0,
and the project's configuration is .NET 3.5.
I'm struggling to read a growing MXF file in real time for a live sports streaming project. I can read the video in 5-minute chunks provided by the recording software, and I’m able to load the full file (around 750GB) once it's complete. However, I need to process the file as it’s still growing. Any suggestions on how to approach this?
How to add the Google Maps API to a web page:
<script
src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&loading=async&libraries=maps&v=beta" defer>
</script>
https://developers.google.com/maps/documentation/javascript/add-google-map
how i felt when bro said ancient : 𓀂𓀬𓁅𓁅𓃂𓄆𓃻
You may try:
=LET(a,TOCOL(C7:E27,1),SORT(UNIQUE(ARRAYFORMULA(TOCOL(a,1)+TOCOL(ARRAYFORMULA(6-TOCOL(ARRAYFORMULA(WEEKDAY(a)),1)),1)))))
Output:
Reference
1. Build your project in release mode.
2. Copy the .exe from the release folder to the folder you want to deploy to, e.g. C:\path\to\folder\deployfolder.
3. Open the Qt command-line tools, for example: Qt 6.5.3 (MinGW 11.2.0 64-bit).
4. Run the command:
windeployqt C:\path\to\folder\deployfolder and that's it.
Is the issue resolved? I am also facing the same problem after the Angular 18 migration. Please help.
I tried your code; it seems to be incomplete and has an error. I modified and converted your Python into JavaScript and added a loop to iterate over the data: if the ID matches an entry in the array, it returns that entry's EUR price, which corresponds to json.data[x].quote.EUR.price in your script.
Code.gs
function getCryptoData(id) {
var url = "https://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest?convert=EUR";
var apiKey = 'xyz'; // Replace with your API key
var headers = {
"X-CMC_PRO_API_KEY": apiKey,
"Accept": "application/json"
};
var response = UrlFetchApp.fetch(url, { 'headers': headers });
var json = JSON.parse(response.getContentText());
for (const x of json.data) {
if (x.id == id) {
console.log(x.name, x.quote.EUR.price);
return x.quote.EUR.price; // return the matched price from getCryptoData
}
}
}
Sample Output:
Reference:
@aled Thank you for your answer! I found it extremely helpful, even though I was barking up the wrong tree and asking the wrong questions. The solution I found was to add another ee:transform component after the wsc:consume component. I was also able to simplify my existing ee:transform so that I didn't have to map each individual element. The Add:\soapkit-config flow now looks like this:
<flow name="Add:\soapkit-config">
<ee:transform doc:name="Transform Input">
<ee:message>
<ee:set-payload><![CDATA[%dw 2.0
output application/xml
ns ns0 http://tempuri.org/
---
payload.body]]></ee:set-payload>
</ee:message>
</ee:transform>
<wsc:consume config-ref="Web_Service_Consumer_Config" operation="Add" doc:name="Outside Consumer" />
<ee:transform doc:name="Transform Output">
<ee:message>
<ee:set-payload><![CDATA[%dw 2.0
output application/xml
ns ns0 http://tempuri.org/
---
payload.body]]></ee:set-payload>
</ee:message>
</ee:transform>
</flow>
When I put logging components in, I can see that after the Transform Output, the payload looks like this:
<?xml version='1.0' encoding='UTF-8'?>
<AddResponse xmlns="http://tempuri.org/">
<AddResult>579</AddResult>
</AddResponse>
... then control flows back to the SOAP Router which puts the payload into a SOAP Body, so it looks like this:
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
<soap:Body>
<AddResponse xmlns="http://tempuri.org/">
<AddResult>579</AddResult>
</AddResponse>
</soap:Body>
</soap:Envelope>
Try changing local.settings.json:
"FUNCTIONS_WORKER_RUNTIME": "dotnet-isolated"
You can create profile-specific configuration files, like application-local.properties and application-sandbox.properties, where you define the logback configurations for "local" and "sandbox" profiles respectively.
Place the shared properties in application.properties, which will be loaded across all profiles. The application-local.properties and application-sandbox.properties files will inherit from application.properties and can override specific settings as needed.
This post is old, so here is an update for newer users needing to customize the client requests.
If you're using the most up-to-date Spring Security, it will use a WebClient in the backend to actually talk with the Reddit auth service. You will need to follow the instructions here https://docs.spring.io/spring-security/reference/reactive/oauth2/index.html#oauth2-client-customize-web-client, passing in the customized WebClient that sets the default header for the OAuth clients to use when making the request.
Of course, it was a problem with the lifecycle of my activity. When I use ViewTreeObserver.OnGlobalLayoutListener instead of .post, the coordinates are returned perfectly.
charlesparker@Charless-Mini KandR % ls -l /usr/bin/gcc
-rwxr-xr-x 77 root wheel 119008 Aug 4 06:31 /usr/bin/gcc
charlesparker@Charless-Mini KandR % which cc
/usr/bin/cc
charlesparker@Charless-Mini KandR % ls -l /usr/bin/cc
-rwxr-xr-x 77 root wheel 119008 Aug 4 06:31 /usr/bin/cc
charlesparker@Charless-Mini KandR % which clang
/usr/bin/clang
charlesparker@Charless-Mini KandR % ls -l /usr/bin/clang
-rwxr-xr-x 77 root wheel 119008 Aug 4 06:31 /usr/bin/clang
Mac mini 2018, macOS Sonoma 14.6.1
I credit most of the logic for this solution to @codtex and @stritch000. I noticed that the solution from @stritch000 uses the same adjustment rule for both the current and next year lookups, which might produce incorrect results if the adjustments change after the first year.
Here is an updated solution that addresses the next-year lookup while also preserving the southern-hemisphere fix. It also uses LINQ. If you plan to run this for many dates, you could cache the results of the LINQ query for a given year/time zone in a dictionary.
public static DateTime? GetNextTransition(DateTime asOfTime, TimeZoneInfo timeZone)
{
var getAdjs = from adj in timeZone.GetAdjustmentRules()
from yr in (int[])[asOfTime.Year, asOfTime.Year + 1]
from t in (TimeZoneInfo.TransitionTime[])[adj.DaylightTransitionStart, adj.DaylightTransitionEnd]
where adj.DateStart.Year <= yr && adj.DateEnd.Year >= yr
select GetAdjustmentDate(t, yr);
if (getAdjs.Where(a => a > asOfTime).Any())
{
return getAdjs.Where(a => a > asOfTime).Min();
}
return null;
}
public static System.Globalization.Calendar cal = System.Globalization.CultureInfo.CurrentCulture.Calendar;
public static DateTime GetAdjustmentDate(TimeZoneInfo.TransitionTime transitionTime, int year)
{
if (!transitionTime.IsFixedDateRule)
{
int minDate = transitionTime.Week * 7 - 6; //1, 8, 15 ... This is the earliest date that works for transition.
var minDateDayOfWeek = cal.GetDayOfWeek(new DateTime(year, transitionTime.Month, 1)); //the day for minDate
int dayDiff = (transitionTime.DayOfWeek - minDateDayOfWeek + 7) % 7;
int transitionDay = minDate + dayDiff;
if (transitionDay > cal.GetDaysInMonth(year, transitionTime.Month))
transitionDay -= 7;
return new DateTime(year, transitionTime.Month, transitionDay, transitionTime.TimeOfDay.Hour, transitionTime.TimeOfDay.Minute, transitionTime.TimeOfDay.Second);
}
else
{
return new DateTime(year, transitionTime.Month, transitionTime.Day);
}
}
Turns out that I had a rogue tab related to a work project.
Normally, stopping the dev server closes all related tabs... except maybe it didn't, and that remaining tab kept polling for the server. So when I started a new server for a different project, it was polling on the same port and getting 404s back.
Try adding the --ignore-scripts flag: npm i node-sass --ignore-scripts
You don't need to use the .tabViewStyle(.page(indexDisplayMode: .never)) modifier to achieve this; you can do it by binding selectedTab to the TabView.
struct ContentView: View {
@State var selectedTab = 0
var body: some View {
ZStack(alignment: .bottom) {
TabView(selection: $selectedTab) {
HomeView(page: 0)
.tag(0)
HomeView(page: 1)
.tag(1)
HomeView(page: 2)
.tag(2)
HomeView(page: 3)
.tag(3)
HomeView(page: 4)
.tag(4)
}
RoundedRectangle(cornerRadius: 25)
.frame(width: 350, height: 70)
.foregroundColor(.white)
.shadow(radius: 0.8)
HStack {
ForEach(0..<5, id: \.self) { index in
Button {
selectedTab = index
} label: {
CustomTabItem(imageName: "cross", title: "Item \(index)", isActive: (selectedTab == index))
}
}
}
.padding(.horizontal, 30)
.frame(height: 70)
}
}
@ViewBuilder func CustomTabItem(imageName: String, title: String, isActive: Bool) -> some View{
VStack(alignment: .center) {
HStack(alignment: .center) {
Spacer()
Image(systemName: imageName)
.resizable()
.renderingMode(.template)
.foregroundColor(isActive ? .purple : .gray)
.frame(width: 25, height: 25)
Spacer()
}
Text(title)
.foregroundColor(isActive ? .purple : .gray)
}
}
}
struct HomeView: View {
var page: Int
var body: some View {
NavigationView {
VStack {
NavigationLink(destination: HomeDetailView()) {
Text("Tab \(page) is selected")
}
}
.background(.red)
}
.navigationViewStyle(.stack)
}
}
struct HomeDetailView: View {
var body: some View {
Rectangle()
.fill(.orange)
}
}
May I add an additional scenario to discuss in this thread?
1. Take snapshot A of the EBS volume.
2. Add some data/files, changing previously empty blocks on the EBS volume.
3. Delete/remove the data/files from the EBS volume.
4. Take snapshot B of the EBS volume.
Are the blocks changed in step 2 treated as empty again and therefore not included in snapshot B, OR are they considered changed and included in snapshot B even though they are empty?
There's a simpler solution, I believe, that I found in this Medium article: https://kulembetov.medium.com/preventing-flash-of-unstyled-content-fouc-in-next-js-applications-61b9a878f0f7
Basically, it consists of adding visibility: hidden; to the body element by default and then restoring visibility client-side once the main layout has mounted (for instance: document.body.classList.add('body-visible');).
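A minimal sketch of the client-side half, assuming a React layout component and a .body-visible CSS rule that restores visibility (the class name comes from the snippet above; the component itself is hypothetical):
'use client';

import { useEffect } from 'react';

export default function BodyVisibility({ children }: { children: React.ReactNode }) {
  useEffect(() => {
    // Runs only after the layout has mounted on the client, so styles are in place
    document.body.classList.add('body-visible');
  }, []);

  return <>{children}</>;
}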
If you are using NewtonsoftJson, make sure to add Nuget package reference to Microsoft.AspNetCore.Mvc.NewtonsoftJson also.
I am also on Windows. I use --force-polling
and it works. I got the solution from this GitHub issue.
This flag forces Liquid to use polling rather than inotify, so note that it is more CPU-intensive.
I faced this issue and was able to resolve it by updating my TypeScript version; I'm currently using "typescript": "^5.6.3". Try that and see if it works. Good luck!
It is unprofessional that only 15 days have passed between the announcement of the deprecation of the functionality and its removal.
You need the MicroPython build from Pimoroni to work with the GFX Pack.
I can only use the REST API to set instance_config to exclude fields. It is still not working with the Python API. I would appreciate someone's help, as there are no code samples for this feature from Vertex AI.
Go to: Control Panel -> Credential Manager -> Generic Credentials.
Choose the Edit option and update your "User name" and "Password".
Check if you have MySQL Server installed, and not just the Workbench.
I have less than 50 reputation, but I need to ask SUBHRA SANKHA if he was able to find a way around this problem.
Note: 🫵 You can help. This issue has been filed as Gerrit bug 40015217. Please +1 it and upvote this question to help it get the attention it deserves.
File rename operations in a git commit are identified by computing the similarity (percentage of lines unchanged) of pairs of files in the commit—one that was deleted and one that was added. If the similarity exceeds some threshold, the file deletion and file addition together are considered a file rename instead. Git’s default similarity threshold for rename detection is 50%. This is clearly documented and Google knows it well.
Gerrit uses JGit, and for some reason its similarity threshold for rename detection is 60% and has been since at least 2010 (commit 978535b). What's more, the threshold isn't customizable. jgit diff has a -M option for detecting renames, but it doesn't accept a custom threshold.
Here are some of the problems that Gerrit users face as a result of Gerrit using a different rename detection threshold than git:
Authors can usually work around this issue by splitting a file rename + edits into multiple commits. In some compiled languages like Java, file renames usually require some edits (such as to the package
line and/or class name) for the file to continue compiling. The minimum amount of edits required to appease the compiler usually keep a file’s similarity index over 60% though. Other edits can come in separate (prior or follow-up) commits.
Authors may not realize, however, that Gerrit won’t properly identify their file rename until after they’ve prepared the commit, written a commit title and description, and uploaded it for review. Their local git installation and git tools (such as IntelliJ’s git integration, for example) properly identified the file rename before they uploaded the commit to Gerrit. At this point, reworking the commit to work around a Gerrit limitation may have significant time cost, since the commit may have several more on top of it, and all of them might need to be rebased and have acceptance tests run again as a result. In short, the workaround may not always be quick and low-cost.
This issue has been reported and discussed in several other places.
It seems that BitBucket (or Stash) recently made the copy/rename similarity thresholds configurable (BSERV-3249). They use git rather than JGit.
IntelliJ/IDEA (a Java application) once tried using JGit for its git plugin, but concluded in this IJPL-88784 comment (emphasis added):
My comments given 18 months ago are still valid. JGit is still below Git. Moreover, you might be surprised, but we actually gave JGit a try: we used it in IDEA 11.X and 12.0 for HTTP remote operations. Our users got a lot of problems that were not easy to fix or even to reproduce, so we've rolled back to native Git. If other projects are happy with JGit, that's their funeral, we have our own vision on the subject.
So it is not because we are lazy to rewrite the plugin. We just don't want to fix issues for the JGit team, and on the other hand we can't say users "blame JGit" if they experience problems with IDEA which they don't have in the command line: they will still blame IDEA, because they don't care which library do we use inside.
I solved it by installing ipywidgets:
pip install ipywidgets
In my case, export DOCKER_HOST="unix://${HOME}/.rd/docker.sock"
also worked. If it works for you, be sure to add it to your shell startup. There's no need to enable administrative access in Rancher Desktop Preferences. Apparently the default DOCKER_HOST is /var/run/docker.sock.
Check the redirect URI in the Google API Console and try updating your Expo configuration.
If you are using MIUI, turn off MIUI optimization. For this: Settings > Developer options > MIUI optimization.
The ElastiCache service is designed to be accessed exclusively from within AWS.
If you want to access it from your local machine, the easiest and cheapest way is to use AWS SSM Start-Session with port forwarding (https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager-working-with-sessions-start.html#sessions-remote-port-forwarding).
First install the Session Manager plugin for AWS CLI (https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager-working-with-install-plugin.html)
Login via CLI to AWS and run:
aws ssm start-session \
--target instance-id \
--document-name AWS-StartPortForwardingSessionToRemoteHost \
--parameters '{"host":["redis-host.us-east-2.elasticache.amazonaws.com"],"portNumber":["6379"], "localPortNumber":["6379"]}'
Then you can access your Redis on localhost:6379. Make sure to test first with TLS disabled.
That was a false-positive case and it has already been resolved: https://github.com/detekt/detekt/issues/3145
As an enhancement, the check could look at the actual byte code generated. If the spread operator leads to a new array instantiation, the spread operator is a performance issue and should be reported. If the spread operator passes through an existing array, the spread operator is not a performance issue and should not be reported.
At my company we just started using .jenkinsfile as an extension:
build.jenkinsfile
and deploy.jenkinsfile
and so forth. It's worked well, and our IDEs are able to parse the syntax just fine.
I had the same problem today. Try opening the resource file with the legacy editor and adding the files: right-click the resource file => Open With => (Legacy). Hope this helps.
There's a new feature in Terraform 1.9 that can help here: a variable validation can now refer to another variable: https://github.com/hashicorp/terraform/blob/v1.9/CHANGELOG.md#190-june-26-2024
Events are fired if you call delete
on each model, not on the builder.
//events not fired
Submission::query()->delete();
//events fired
Submission::query()->get()->each->delete();
Live code example in laravel sandbox: https://sandbox.ws/en/shared/e0eadb68-f145-46a2-9981-188ee3c34e8a
People, why so complicated?
const ExtractDomain = url => url.includes('/') ? url.split('/')[2] : '';
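If the input is always a full URL, the built-in URL API is a bit more robust than splitting on '/', since it also copes with ports, credentials, and paths; a small sketch:
const extractDomain = (url: string): string => {
  try {
    return new URL(url).hostname;
  } catch {
    return ''; // not a parseable absolute URL
  }
};

console.log(extractDomain('https://user:pass@example.com:8080/path')); // "example.com"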
I'm starting to think it's something global.
I started having this problem two or four days ago; nothing changed in my app, but today when I tried to do an npm run build, I got the same thing.
SyntaxError: Unexpected identifier '#Y'.
at wrapSafe (node:internal/modules/cjs/loader:1427:18)
at Module._compile (node:internal/modules/cjs/loader:1449:20)
at Module._extensions.js (node:internal/modules/cjs/loader:1588:10)
at Module.load (node:internal/modules/cjs/loader:1282:32)
at Module._load (node:internal/modules/cjs/loader:1098:12)
at TracingChannel.traceSync (node:diagnostics_channel:315:14)
at wrapModuleLoad (node:internal/modules/cjs/loader:215:24)
at Module.require (node:internal/modules/cjs/loader:1304:12)
at mod.require (E:\git-challenge-traveler-visitor-visitant_modules_modules_next_distress-server-require-hook.js:65:28)
in require (node:internal/modules/helpers:123:16)
> A compilation error has occurred
Error: Failed to collect page data for /api/dashboard/create-company
at E:\git-challenge-traveler-visitant “client” modules “nextdistinct”.js:1268:15
in process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
type: 'Error'
}
I tried doing the build on past commits where the site was working fine, and it's still the same. Does anyone know where one could report such an error? I think probably in the Vercel forum; however, I was asked what I did, and the truth is: nothing.
I was insanely jealous when Van Jacobson of LBL used my kernel ICMP support to write TRACEROUTE, by realizing that he could get ICMP Time-to-Live Exceeded messages when pinging by modulating the IP time to live (TTL) field. I wish I had thought of that! :-) Of course, the real traceroute uses UDP datagrams because routers aren't supposed to generate ICMP error messages for ICMP messages.
It is working in the live environment; be sure that you visit the link http://localhost/opencart/index.php?route=custom/simplejson without any .php extension at the end.
Problem solved. For those who have a similar problem, I'm sharing the code. Thanks to everyone for commenting and responding.
<?php
include("conexion.php"); // Asumiendo que tu conexión está configurada aquí
require __DIR__ . "/vendor/autoload.php";
$objCon = new Conexion();
use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\IOFactory;
use PhpOffice\PhpSpreadsheet\Cell\Coordinate;
use Monolog\Level;
use Monolog\Logger;
use Monolog\Handler\StreamHandler;
use Monolog\Handler\FirePHPHandler;
// Basic configuration to display all errors
error_reporting(E_ALL);
ini_set('display_errors', 'On');
//Storing the tipo_actividad and id_reporte
$tipo_Actividad=$_POST['tipo_actividad'];
$id_reporte = $_POST['id_reporte'];
class Registro
{
public $interno;
public function __construct($data)
{
$this->interno = [
'doc_identidad' => $data['documento_identidad'],
'datos_adicionales' => [
'nombre_completo' => $data['apellidos_nombres_interno'],
'fecha_ingreso' => $data['fecha_ingreso'],
'fecha_nac' => $data['fecha_nacimiento'],
'discapacidad' => $data['discapacidad'],
'planificacion_intervencion' => $data['planificacion_intervencion'],
'tipo_intervencion' => $data['tipo_intervencion'],
'grupo_especifico' => $data['grupo_especifico'],
'otros_relevantes' => $data['otros_relevantes'],
'regimen' => $data['regimen'],
'etapa' => $data['etapa'],
'pabellon' => $data['pabellon'],
'descripcion' => mb_convert_encoding($data['descripcion'], "UTF-8"),
'dia' => $data['dia'],
'nombre_sesion' => mb_convert_encoding($data['nombre_sesion'], "UTF-8"),
'profesional' => $data['datos_profesional'],
'observaciones' => $data['observaciones']
// ... other additional data
]
];
// $this->nombre_completo = $data['apellidos_nombres_interno'];
}
}
class GeneradorReporteExcel
{
private $spreadsheet;
private $sheet;
private $logger;
public function __construct()
{
$this->spreadsheet = new Spreadsheet();
$this->sheet = $this->spreadsheet->getActiveSheet();
//Logs
$this->logger = new Logger('my_app');
$this->logger->pushHandler(new StreamHandler(__DIR__ . '/reportes/debug.log', Level::Debug));
// $this->logger->info('My logger is now ready');
$this->logger->pushHandler(new FirePHPHandler());
}
private function obtenerValor($registro, $columna, $fila, $tipoActividad)
{
switch ($columna) {
case 'C':
return $registro->interno['doc_identidad'];
case 'D':
return $registro->interno['datos_adicionales']['nombre_completo'];
case 'E':
return 'MASCULINO';
case 'F':
return $registro->interno['datos_adicionales']['fecha_ingreso'];
case 'G':
return $registro->interno['datos_adicionales']['fecha_nac'];
case 'H':
// Get the date-of-birth cell (assuming 'G2' is relative)
$celdaFechaNacimiento = 'G' . $fila; // Adjust the row according to your logic
// Build the full formula
$formula = "=(NOW()-" . $celdaFechaNacimiento . ")/365-0.5";
return $formula;
case 'I':
return $registro->interno['datos_adicionales']['discapacidad'];
case 'J':
// Planning of the intervention
if ($registro->interno['datos_adicionales']['planificacion_intervencion'] == 1) {
$planificacion_intervencion = "PTI_EN_PROCESO";
}
if ($registro->interno['datos_adicionales']['planificacion_intervencion'] == 2) {
$planificacion_intervencion = "PTI_P";
}
if ($registro->interno['datos_adicionales']['planificacion_intervencion'] == 3) {
$planificacion_intervencion = "PTI_S";
}
if ($registro->interno['datos_adicionales']['planificacion_intervencion'] == 4) {
$planificacion_intervencion = "POPE_ANTIGUA";
}
return $planificacion_intervencion;
case 'K':
// Type of intervention
if ($registro->interno['datos_adicionales']['tipo_intervencion'] == 1) {
$tipo_intervencion = "INDUCCION_Y_ADAPTACION_AL_REGIMEN_PENITENCIARIO";
}
if ($registro->interno['datos_adicionales']['tipo_intervencion'] == 2) {
$tipo_intervencion = "INTERVENCION_GENERAL";
}
if ($registro->interno['datos_adicionales']['tipo_intervencion'] == 3) {
$tipo_intervencion = "INTERVENCION_ESPECIALIZADA";
}
if ($registro->interno['datos_adicionales']['tipo_intervencion'] == 4) {
$tipo_intervencion = "PROGRAMACIÓN_SOBRE_NECESIDADES_DE_INTERVENCIÓN_COMPLEMENTARIA";
}
return $tipo_intervencion;
case 'L':
return $registro->interno['datos_adicionales']['grupo_especifico'];
case 'M':
return $registro->interno['datos_adicionales']['otros_relevantes'];
case 'N':
return $registro->interno['datos_adicionales']['regimen'];
case 'O':
return $registro->interno['datos_adicionales']['etapa'];
case 'P':
return $registro->interno['datos_adicionales']['pabellon'];
case 'Q':
case 'S':
case 'U':
case 'W':
case 'Y':
case 'AA':
case 'AC':
if ($tipoActividad === 1) {
$descripcion=$registro->interno['datos_adicionales']['descripcion']; // Or whichever description field applies
} else {
$descripcion='';
}
return $descripcion;
case 'R':
case 'T':
case 'V':
case 'X':
case 'Z':
case 'AB':
case 'AD':
if ($tipoActividad === 1) {
$dia=$registro->interno['datos_adicionales']['dia']; // Or whichever description field applies
} else {
$dia='';
}
return $dia;
case 'AE':
case 'AG':
case 'AI':
case 'AK':
case 'AM':
if ($tipoActividad == 2) {
// Handle the case when tipoActividad is 2 (if applicable)
$nombre_sesion=$registro->interno['datos_adicionales']['nombre_sesion']; // Or whichever session-name field applies
} else {
$nombre_sesion='';
}
return $nombre_sesion;
case 'AF':
case 'AH':
case 'AJ':
case 'AL':
case 'AN':
if ($tipoActividad == 2) {
// Handle the case when tipoActividad is neither 1 nor 2
$dia=$registro->interno['datos_adicionales']['dia']; // Or whichever session-name field applies
} else {
$dia='';
}
return $dia;
case 'AO':
return $registro->interno['datos_adicionales']['profesional'];
case 'AP':
return $registro->interno['datos_adicionales']['observaciones'];
// case 'AQ':
// return $registro->interno['datos_adicionales']['nombre_completo'];
default:
return '';
}
}
private function escribirEnCelda($columna, $fila, $valor)
{
// $this->logger->debug("Escribiendo $valor en la celda $columna$fila");
// $this->sheet->setCellValue($columna . $fila, $valor);
try {
// $this->logger->info('My logger is now ready');
// $this->logger->debug("Escribiendo $valor en la celda $columna$fila");
$this->sheet->setCellValue($columna . $fila, $valor);
} catch (\Exception $e) {
$this->logger->error("Error al escribir en la celda: " . $e->getMessage());
}
}
private function getRangoColumnas($tipoActividad, $esNuevoRegistro)
{
if($tipoActividad==1){
return $esNuevoRegistro ? ['C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z', 'AA', 'AB', 'AC', 'AD'] : ['O', 'P', 'Q', 'R', 'S', 'T', 'U', 'V', 'W', 'X', 'Y', 'Z', 'AA', 'AB', 'AC', 'AD'];
}else{
return $esNuevoRegistro ? ['C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M', 'N', 'O', 'P', 'AE', 'AF', 'AG', 'AH','AI','AJ','AK','AL','AM','AN','AO','AP'] : ['AE', 'AF', 'AG', 'AH','AI','AJ','AK','AL','AM','AN'];
}
}
private function buscarFilaPorDni($dni)
{
// Assuming the DNI is in column C
$highestRow = $this->sheet->getHighestRow();
// $this->logger->debug("Cuantas filas hay?: $highestRow");
for ($row = 2; $row <= $highestRow; $row++) {
if ($this->sheet->getCell('C' . $row)->getValue() == $dni) {
// $this->logger->debug("Fila del archivo Excel: $this->sheet->getCell('C' . $row)->getValue() == DNI: $dni");
$tipoActividad = 2; //$this->sheet->getCell('A' . $row)->getValue(); // Assuming the activity type is in column A
$rangoColumnas = $this->getRangoColumnas($tipoActividad, false);
return [
'fila' => $row,
'rangoColumnas' => $rangoColumnas
];
}
// $this->logger->debug("DNI EXCEL: $this->sheet->getCell('C' . $row)->getValue() NO ES IGUAL A DNI SQL: $dni");
}
return false;
}
private function crearNuevaFila($fila, $registro, $rangoColumnas, $tipoActividad,$esNuevoRegistro) {
$columnaIndex = 0; // Initialize the column index
foreach ($rangoColumnas as $columna) {
if($tipoActividad==2 && $esNuevoRegistro==true){
if($columnaIndex==16 || $columnaIndex==17 || $columnaIndex==18 || $columnaIndex==19 || $columnaIndex==20 || $columnaIndex==21 || $columnaIndex==22){
// $this->logger->debug("columnaIndex: $columnaIndex - Pertenece a la columna: $columna ---> NO SE REGISTRO");
continue;
}
}else{
}
$valor = $this->obtenerValor($registro, $columna, $fila, $tipoActividad);
$this->escribirEnCelda($columna, $fila, $valor);
// $this->logger->debug("columnaIndex: $columnaIndex - Pertenece a la columna: $columna ---> valor = $valor ");
$columnaIndex++;
}
}
private function actualizarFila($fila, $registro, $rangosAdicionales)
{
$tipoActividad = 2; //$registro->tipo_actividad;
// Find the first empty column in the additional range
$columnaVacia = null;
// $this->logger->debug("Rango: $rangosAdicionales[$tipoActividad]");
foreach ($rangosAdicionales as $columna) {
if ($this->sheet->getCell( $columna. $fila)->getValue() === null) {
$columnaVacia = $columna;
// $this->logger->debug("Columna Vacia: $columnaVacia");
break;
}
}
// If an empty column was found, write the value
if ($columnaVacia) {
// $columnaIndex = array_search($columnaVacia, $rangosAdicionales);
// $this->logger->debug("ColumnaIndex: $columnaIndex ");
$sesion =$registro->interno['datos_adicionales']['nombre_sesion']; // $this->obtenerValor($registro, $columna);
$this->escribirEnCelda($columnaVacia, $fila, $sesion);
// Incrementing by 1 for the day
$columnaVacia++;
$dia=$registro->interno['datos_adicionales']['dia'];
$this->escribirEnCelda($columnaVacia, $fila, $dia);
}
}
public function generarReporte($registros, $rutaArchivo, $formatoArchivo,$tipoActividad)
{
// Load the format (template) file
$this->spreadsheet = IOFactory::load($formatoArchivo);
$this->sheet = $this->spreadsheet->getActiveSheet();
$fila = 2;
foreach ($registros as $registro) {
$tipoActividad = $tipoActividad; //$registro->datos_adicionales['tipo_intervencion'];
// Determine whether it is a new record or not ---> OK!
if ($this->buscarFilaPorDni($registro->interno['doc_identidad'])) {
$rangoColumnas = $this->getRangoColumnas($tipoActividad, false); // Existing record
// Finding the duplicated row
// $dni = $registro->interno['doc_identidad'];
$resultadoBusqueda=$this->buscarFilaPorDni($registro->interno['doc_identidad'])['fila'];
// $this->logger->debug("FILA REPETIDA: ".$resultadoBusqueda);
// if (is_array($resultadoBusqueda)) {
// $filaExistente = $resultadoBusqueda[0];//$this->buscarFilaPorDni($dni)[0];
// $this->logger->debug("FILA REPETIDA: $filaExistente");
// }
$this->actualizarFila($resultadoBusqueda, $registro, $rangoColumnas);
// $this->logger->debug("BuscarFilaDNI devuelve estos valores: ".json_encode($this->buscarFilaPorDni($registro->interno['doc_identidad'])));
} else {
$rangoColumnas = $this->getRangoColumnas($tipoActividad, true); // New record
$this->crearNuevaFila($fila, $registro, $rangoColumnas, $tipoActividad,true);
$fila++;
// $this->logger->debug("BuscarFilaDNI devuelve estos valores: ".$this->buscarFilaPorDni($registro->interno['doc_identidad']));
}
}
// Save the Excel file
$writer = IOFactory::createWriter($this->spreadsheet, 'Xlsx');
$writer->save($rutaArchivo);
}
}
// SQL query
$sql = "SELECT I.doc_identidad AS 'DOCUMENTO_IDENTIDAD',
I.ape_nombres AS 'APELLIDOS_NOMBRES_INTERNO',
I.fecha_ingreso AS 'FECHA_INGRESO',
I.fecha_nacimiento AS 'FECHA_NACIMIENTO',
I.discapacidad AS 'discapacidad',
DET.planificacion_intervencion AS 'PLANIFICACION_INTERVENCION',
DET.tipo_intervencion AS 'TIPO_INTERVENCION',
GRUP.nombre_grupo_especifico AS 'GRUPO_ESPECIFICO',
DET.otros_relevantes AS 'OTROS_RELEVANTES',
REG.nombre_regimen AS 'REGIMEN',
ETP.nombre_etapa AS 'ETAPA',
I.pab_celda_etapa AS 'PABELLON',
DESCRIP.nombre_descripciones AS 'DESCRIPCION',
DET.dia_actividades AS 'dia',
SES.nombre_sesiones AS 'nombre_sesion',
CONCAT(U.ape_paterno,' ',U.ape_materno,' ',U.nombres) AS 'DATOS_PROFESIONAL',
DET.observaciones AS 'OBSERVACIONES',
DET.cambio_PTIP_PTIS AS 'PTI_P_PTI_S'
FROM reportes_mensuales REP
INNER JOIN detalle_reporte_mensual DET ON REP.id_reporte_mensual=DET.id_reporte_mensual
INNER JOIN grupo_especifico GRUP ON DET.id_grupo_especifico=GRUP.id_grupo_especifico
INNER JOIN internos I ON DET.id_interno=I.id_interno
INNER JOIN usuarios U ON REP.id_usuario=U.id_usuario
INNER JOIN regimen REG ON DET.id_regimen=REG.id_regimen
INNER JOIN etapa ETP ON DET.id_etapa=ETP.id_etapa
INNER JOIN descripciones DESCRIP ON DET.id_descripciones=DESCRIP.id_descripciones
INNER JOIN sessiones SES ON DET.id_sesiones=SES.id_sesiones
WHERE REP.id_reporte_mensual=:id_reporte"; // Your full SQL query
$rsDetalleReporte = $objCon->getConexion()->prepare($sql);
$rsDetalleReporte->bindParam(':id_reporte', $id_reporte, PDO::PARAM_INT);
if ($rsDetalleReporte->execute()) {
$registros = [];
while ($row = $rsDetalleReporte->fetch(PDO::FETCH_ASSOC)) {
$registro = new Registro($row);
$registros[] = $registro;
// Print the record data to verify
// echo json_encode($registro)."<br>";
}
// Check whether the $registros array is empty
if (empty($registros)) {
echo "No se encontraron registros.";
} else {
// Generate the report
$generador = new GeneradorReporteExcel();
$generador->generarReporte($registros, 'reportes/reporte_actualizado.xlsx', 'reportes/FORMATO_SOCIAL.xlsx',$tipo_Actividad);
}
} else {
// Handle errors in the query execution
echo "Error al ejecutar la consulta: " . $rsDetalleReporte->errorInfo()[2];
}
?>
I've already solved it on my own, using another library: spaCy.
My code:
import spacy
# Load the spaCy model for Russian
nlp = spacy.load("ru_core_news_sm")
# Apply spaCy for segmentation
doc = nlp("Какая погода в Москве?")
# Print all detected entities
for ent in doc.ents:
print(ent.text, ent.label_)
for /f "tokens=2 delims=:" %%a in ('findstr "lines:" .\reports\publish\coverage.xml') do set lines_coverage=%%a
for /f "tokens=2 delims=:" %%a in ('findstr "functions:" .\reports\publish\coverage.xml') do set functions_coverage=%%a
for /f "tokens=2 delims=:" %%a in ('findstr "branches:" .\reports\publish\coverage.xml') do set branches_coverage=%%a
:: Remove whitespace characters and check the values
set lines_coverage=!lines_coverage: =!
set functions_coverage=!functions_coverage: =!
set branches_coverage=!branches_coverage: =!
:: Check if coverage is greater than 90% for lines, functions, and branches
for /f "delims=." %%a in ("!lines_coverage!") do set lines_coverage_int=%%a
for /f "delims=." %%a in ("!functions_coverage!") do set functions_coverage_int=%%a
for /f "delims=." %%a in ("!branches_coverage!") do set branches_coverage_int=%%a
if !lines_coverage_int! geq 90 (
echo Lines coverage is !lines_coverage!%%.
) else (
echo Lines coverage is below 90%%: !lines_coverage!%%
goto :fail
)
This is my code. Are there any issues with the if and else statements?
Subscription policies are defined for specific subscriptions, that is, between a specific application and an API. If you want to restrict access to a particular API after a certain number of requests, you may configure an Advanced policy from the Admin portal and add restrictions for your API from the Publisher portal.
Yes, in Shopify, you can update the password for a customer using the Storefront API but not directly via the Admin API. This process requires sending a password reset email to the customer, where they can update their password themselves. Shopify does not provide direct access to customer passwords through the Admin API due to security concerns.
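As a hedged sketch of triggering that reset e-mail via the Storefront API's customerRecover mutation (the shop domain, API version, and access token below are placeholders; double-check the mutation shape against the current Storefront API docs):
const endpoint = 'https://your-shop.myshopify.com/api/2024-07/graphql.json'; // placeholder shop/version

async function sendPasswordReset(email: string): Promise<void> {
  const query = `
    mutation customerRecover($email: String!) {
      customerRecover(email: $email) {
        customerUserErrors { field message }
      }
    }
  `;

  const res = await fetch(endpoint, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Shopify-Storefront-Access-Token': '<storefront-access-token>', // placeholder
    },
    body: JSON.stringify({ query, variables: { email } }),
  });

  const json = await res.json();
  // Any validation problems (e.g. unknown e-mail) come back here
  console.log(json.data?.customerRecover?.customerUserErrors);
}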
Unfortunately you have to use and pay for IntelliJ IDEA Ultimate to have code completion for Node.js, as you can see in the comparison of IntelliJ IDEA Ultimate and IntelliJ IDEA Community Edition: https://www.jetbrains.com/products/compare/?product=idea&product=idea-ce
In IntelliJ IDEA Community Edition, code completion for Node.js and JavaScript is not supported.
In my case, in application.properties
changing spring.security.oauth2.client.registration.google.scope=openid, profile, email
to spring.security.oauth2.client.registration.google.scope=profile, email
solved the problem for me.
Debian 12
apt install libmagickcore-6.q16-dev
To access protected members of a class, you need to use the keyword this, as shown in the TypeScript documentation.
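A minimal illustration of the point: a protected member is reachable from inside the class (and subclasses) through this, but not from outside.
class Greeter {
  protected name: string;

  constructor(name: string) {
    this.name = name;
  }

  greet(): string {
    return `Hello, ${this.name}`; // OK: accessed via `this`
  }
}

const g = new Greeter('Ada');
console.log(g.greet());
// console.log(g.name); // Error: 'name' is protected and only accessible within class 'Greeter' and its subclasses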