I just deployed the following app and I'm receiving an error. Please help me solve this issue.
$instance_name = "mysql:unix_socket=/cloudsql/My_Project_ID:google-cloud-instance";
$c = new mysqli(null, "root", "My_Password", 'My_DB_Name', 0, $instance_name);
if ($c->connect_error) {
    echo $c->connect_error;
    $c = null;
} else {
    echo "Connected";
}
And I receive the following error in the browser:
Unable to find the socket transport "unix" - did you forget to enable it when you configured PHP?
The socket argument expects a bare socket path, without any prefix; the mysql:unix_socket= part is PDO DSN syntax:
$socket_path = "/cloudsql/My_Project_ID:google-cloud-instance";
$c = new mysqli(null, "my-user", "my-password", 'my-db', 0, $socket_path);
Note that the socket is the sixth parameter of the mysqli constructor, after the port.
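For comparison, the unix_socket= prefix does belong in a PDO DSN. Here is a minimal PDO sketch reusing the placeholder values from the question (hypothetical credentials), in case you prefer PDO over mysqli:
<?php
// PDO is where the mysql:unix_socket= DSN syntax applies.
// Placeholder project/instance/credentials taken from the question.
$dsn = "mysql:unix_socket=/cloudsql/My_Project_ID:google-cloud-instance;dbname=My_DB_Name";
try {
    $pdo = new PDO($dsn, "root", "My_Password");
    echo "Connected";
} catch (PDOException $e) {
    echo $e->getMessage();
}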
I've set up Amazon SES on my EC2 instance as described in this link.
The test file works fine and sends the email when I run php teste.php in the terminal on my EC2 instance. The problem is when I try to make a POST request to access this file and send the email: it throws a 500 error and I don't know how to fix it.
Here are the error messages from Chrome and from Insomnia (screenshots not included). Here is my test file:
<?php
use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

// Location of your Composer autoload.php file.
require '../../../home/ec2-user/vendor/autoload.php';

$sender = 'email@email.com';
$senderName = 'Teste';
$recipient = 'email@email.com';

$usernameSmtp = '[removed for security]';
$passwordSmtp = '[removed for security]';

$host = 'email-smtp.sa-east-1.amazonaws.com';
$port = 587;

$subject = 'Amazon SES test (SMTP interface accessed using PHP)';
$bodyText = "Email Test\r\nThis email was sent through the Amazon SES SMTP interface using the PHPMailer class.";
$bodyHtml = '<h1>Email Test</h1>
<p>This email was sent through the Amazon SES SMTP
interface using the <a href="https://github.com/PHPMailer/PHPMailer">PHPMailer</a> class.</p>';

$mail = new PHPMailer(true);

try {
    $mail->isSMTP();
    $mail->setFrom($sender, $senderName);
    $mail->Username = $usernameSmtp;
    $mail->Password = $passwordSmtp;
    $mail->Host = $host;
    $mail->Port = $port;
    $mail->SMTPAuth = true;
    $mail->SMTPSecure = 'tls';
    // Custom header; a configuration set name can be passed as a second argument.
    $mail->addCustomHeader('X-SES-CONFIGURATION-SET');
    $mail->addAddress($recipient);
    $mail->isHTML(true);
    $mail->Subject = $subject;
    $mail->Body = $bodyHtml;
    $mail->AltBody = $bodyText;
    $mail->send();
    echo "Email sent!", PHP_EOL;
} catch (Exception $e) {
    // PHPMailer's own Exception class (imported above); the old
    // phpmailerException class no longer exists in PHPMailer 6.
    echo "Email not sent. {$mail->ErrorInfo}", PHP_EOL;
} catch (\Exception $e) {
    // Any other error.
    echo "An error occurred. {$e->getMessage()}", PHP_EOL;
}
You need to open port 587 on your instance's security group to make it work.
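If you want to rule out connectivity before digging into the HTTP 500, here is a minimal PHP sketch (host and port taken from the question) that checks whether the SMTP endpoint is reachable from the web server's context:
<?php
// Quick reachability check for the SES SMTP endpoint.
$host = 'email-smtp.sa-east-1.amazonaws.com'; // from the question
$port = 587;
$socket = @fsockopen($host, $port, $errno, $errstr, 10);
if ($socket === false) {
    echo "Cannot reach $host:$port - $errstr ($errno)", PHP_EOL;
} else {
    echo "Port $port is open on $host", PHP_EOL;
    fclose($socket);
}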
A similar question was asked on the AWS developer forum.
EDIT: using SSL instead of STARTTLS:
$mail->Host = 'ssl://email-smtp.sa-east-1.amazonaws.com';
$mail->SMTPSecure = 'tls';
$mail->Port = 443;
We would like to hand over development to another company via Azure DevOps, and we're wondering whether such a pipeline can only push new releases, or whether data could also be downloaded from the production environment via an Azure DevOps or AWS pipeline.
I researched this myself but found nothing about it.
Does anyone have more information on this?
Thank you
Is it possible to download files/data during the build pipeline on Azure DevOps?
In Azure DevOps there isn't a built-in task to download files/data, but you can use the PowerShell task to connect to an FTP server and download files.
For detailed information, you can refer to this similar question.
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # FTP server information - set variables
      $ftp = "ftp://XXX.com/"
      $user = 'UserName'
      $pass = 'Password'
      $folder = 'FTP_Folder'
      $target = "C:\Folder\Folder1\"

      # Set credentials
      $credentials = New-Object System.Net.NetworkCredential($user, $pass)

      # List the files in an FTP directory
      function Get-FtpDir ($url, $credentials) {
          $request = [Net.WebRequest]::Create($url)
          $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
          if ($credentials) { $request.Credentials = $credentials }
          $response = $request.GetResponse()
          $reader = New-Object IO.StreamReader $response.GetResponseStream()
          while (-not $reader.EndOfStream) {
              $reader.ReadLine()
          }
          $reader.Close()
          $response.Close()
      }

      # Set the folder path ($ftp already ends with a slash)
      $folderPath = $ftp + $folder + "/"
      $files = Get-FtpDir -url $folderPath -credentials $credentials
      $files

      $webclient = New-Object System.Net.WebClient
      $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

      $counter = 0
      foreach ($file in ($files | Where-Object { $_ -like "*.txt" })) {
          $source = $folderPath + $file
          $destination = $target + $file
          $webclient.DownloadFile($source, $destination)

          # Print the counter and file name
          $counter++
          $counter
          $source
      }
Credit: the script comes from PowerShell Connect to FTP server and get files.
You should use artifacts when the data lives inside your "environment".
Otherwise you can use normal command-line tools such as git, curl, or wget; which of these are available depends on your build agent.
We previously generated a list of Google's API endpoints used by the SDK by grepping the source repo. Now that the repo no longer seems to be available, has anyone found another way of obtaining such a list? We need to be able to whitelist these endpoints on our corporate firewall/proxy.
Thanks!
PART 1
If your objective is to whitelist URLs for your firewall, the URL *.googleapis.com will cover 99% of everything you need. There are only a few endpoints left:
bookstore.endpoints.endpoints-portal-demo.cloud.goog
cloudvolumesgcp-api.netapp.com
echo-api.endpoints.endpoints-portal-demo.cloud.goog
elasticsearch-service.gcpmarketplace.elastic.co
gcp.redisenterprise.com
payg-prod.gcpmarketplace.confluent.cloud
prod.cloud.datastax.com
PART 2
List the Google API endpoints that are available for your project with this command:
gcloud services list --available --format json | jq -r ".[].config.name"
https://cloud.google.com/sdk/gcloud/reference/services/list
Refer to PART 5 for a PowerShell script that produces a similar list.
PART 3
Process the Discovery Document, which provides machine-readable API information:
Google API Discovery Service
curl https://www.googleapis.com/discovery/v1/apis | jq -r ".items[].discoveryRestUrl"
Once you have the list of discovery documents, process each document and extract the rootUrl key. Note the single quotes: $discovery is a literal path segment in these URLs, not a shell variable:
curl 'https://youtubereporting.googleapis.com/$discovery/rest?version=v1' | jq -r ".rootUrl"
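If jq isn't available, the same extraction can be sketched in PHP (assumes allow_url_fopen is enabled; stale discovery URLs that return 404 are skipped):
<?php
// Fetch Google's API discovery directory and print each API's rootUrl.
$directory = json_decode(file_get_contents('https://www.googleapis.com/discovery/v1/apis'), true);
foreach ($directory['items'] as $item) {
    // Some discovery URLs return 404; suppress the warning and skip those.
    $doc = @file_get_contents($item['discoveryRestUrl']);
    if ($doc === false) {
        continue;
    }
    $api = json_decode($doc, true);
    if (isset($api['rootUrl'])) {
        echo $api['rootUrl'], PHP_EOL;
    }
}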
PART 4
PowerShell script to process the Discovery Document and generate an API endpoint list:
Copy this code to a file named list_google_apis.ps1. Run the command as follows:
powershell ".\list_google_apis.ps1 | Sort-Object -Unique | Out-File -Encoding ASCII -FilePath apilist.txt"
There will be some errors displayed as some of the discovery document URLs result in 404 (NOT FOUND) errors.
$url_discovery = "https://www.googleapis.com/discovery/v1/apis"

$params = @{
    Uri         = $url_discovery
    ContentType = 'application/json'
}

$r = Invoke-RestMethod @params

foreach ($item in $r.items) {
    $url = $item.discoveryRestUrl
    try {
        $p = @{
            Uri         = $url
            ContentType = 'application/json'
        }
        $doc = Invoke-RestMethod @p
        $doc.rootUrl
    } catch {
        Write-Host "Failed:" $url -ForegroundColor Red
    }
}
PART 5
PowerShell script that I wrote a while back that produces similar output to gcloud services list.
Documentation for the API:
https://cloud.google.com/service-usage/docs/reference/rest/v1/services/list
<#
.SYNOPSIS
This program displays a list of Google Cloud services
.DESCRIPTION
Google Service Management allows service producers to publish their services on
Google Cloud Platform so that they can be discovered and used by service consumers.
.NOTES
This program requires that the Google Cloud SDK CLI is installed and set up.
https://cloud.google.com/sdk/docs/quickstarts
.LINK
PowerShell Invoke-RestMethod
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/invoke-restmethod?view=powershell-5.1
Google Cloud CLI print-access-token Documentation
https://cloud.google.com/sdk/gcloud/reference/auth/print-access-token
Google Cloud API Documentation
https://cloud.google.com/service-infrastructure/docs/service-management/reference/rest
https://cloud.google.com/service-usage/docs/reference/rest/v1/services
https://cloud.google.com/service-infrastructure/docs/service-management/reference/rest/v1/services/list
#>
function Get-AccessToken {
    # Get an OAuth access token from the gcloud CLI
    $accessToken = gcloud auth print-access-token
    return $accessToken
}

function Display-ServiceTable {
    Param([array][Parameter(Position = 0, Mandatory = $true)] $serviceList)

    if ($serviceList.Count -lt 1) {
        Write-Output "No services were found"
        return
    }

    # Display as a table
    $serviceList.serviceConfig | Select-Object name, title | Format-Table -Wrap | more
}

function Get-ServiceList {
    Param([string][Parameter(Position = 0, Mandatory = $true)] $accessToken)

    # Build the URL
    # https://cloud.google.com/service-infrastructure/docs/service-management/reference/rest/v1/services/list
    $url = "https://servicemanagement.googleapis.com/v1/services"

    # Build the Invoke-RestMethod parameters
    # https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/invoke-restmethod?view=powershell-5.1
    $params = @{
        Headers     = @{
            Authorization = "Bearer " + $accessToken
        }
        Method      = 'Get'
        ContentType = "application/json"
    }

    # Create an array to store the API output, which is an array of services
    $services = @()

    # Google APIs page the output
    $nextPageToken = $null

    do {
        if ($nextPageToken -eq $null) {
            $uri = $url
        } else {
            $uri = $url + "?pageToken=$nextPageToken"
        }

        try {
            # Get the list of services
            $output = Invoke-RestMethod @params -Uri $uri
        } catch {
            Write-Host "Error: REST API failed." -ForegroundColor Red
            Write-Host "URL: $url" -ForegroundColor Red
            Write-Host $_.Exception.Message -ForegroundColor Red
            return $services
        }

        # Debug: display as JSON
        # $output | ConvertTo-Json

        # Append services to the list
        $services += $output.services
        $nextPageToken = $output.nextPageToken
    } while ($nextPageToken -ne $null)

    return $services
}
############################################################
# Main Program
############################################################
$accessToken = Get-AccessToken
$serviceList = Get-ServiceList $accessToken
Display-ServiceTable $serviceList
Command-line tool: jq
Has anyone had any success migrating files from the Parse S3 bucket to an S3 bucket of their own? I have an app that contains many files (images), and I'm serving them from both my own S3 bucket and from the Parse bucket using the S3 file adapter, but I'd like to migrate the physical files to my own bucket on AWS, where the app will now be hosted.
Thanks in advance!
If you've configured your new Parse instance to host files with the S3 file adapter, you can write a PHP script that downloads each file from the Parse S3 bucket and uploads it to your own. In my example (using the Parse-PHP-SDK):
I loop through every entry.
I download the binary of that file (hosted on Parse).
I upload it as a new ParseFile (if your server is configured for S3, it will be uploaded to your own S3 bucket).
I apply the new ParseFile to the entry.
Voilà!
<?php
require 'vendor/autoload.php';

use Parse\ParseObject;
use Parse\ParseQuery;
use Parse\ParseACL;
use Parse\ParsePush;
use Parse\ParseUser;
use Parse\ParseInstallation;
use Parse\ParseException;
use Parse\ParseAnalytics;
use Parse\ParseFile;
use Parse\ParseCloud;
use Parse\ParseClient;

$app_id = "AAA";
$rest_key = "BBB";
$master_key = "CCC";

ParseClient::initialize($app_id, $rest_key, $master_key);
ParseClient::setServerURL('http://localhost:1338/', 'parse');

$query = new ParseQuery("YourClass");
$query->descending("createdAt"); // just my preference
$count = $query->count();

for ($i = 0; $i < $count; $i++) {
    try {
        $query->skip($i);
        // Get the entry
        $entryWithFile = $query->first();
        // Get the file
        $parseFile = $entryWithFile->get("file");
        // Filename
        $fileName = $parseFile->getName();
        echo "\nFilename #" . $i . ": " . $fileName;
        echo "\nObjectId: " . $entryWithFile->getObjectId();
        // Parse-hosted files have names prefixed with "tfss-";
        // anything else is already ours, so continue with the next one
        if (strpos($fileName, "tfss-") === false) {
            echo "\nThis is already an internal file, skipping...";
            continue;
        }
        $newFileName = str_replace("tfss-", "", $fileName);
        $binaryFile = file_get_contents($parseFile->getURL());
        // Null by default; you don't need to specify it if you don't want to.
        $fileType = "binary/octet-stream";
        $newFile = ParseFile::createFromData($binaryFile, $newFileName, $fileType);
        $entryWithFile->set("file", $newFile);
        $entryWithFile->save(true);
        echo "\nFile saved\n";
    } catch (Exception $e) {
        // The connection to Mongo or the server could drop for a few seconds; retry ;)
        $i = $i - 1;
        sleep(10);
        continue;
    }
}

echo "\n";
echo "¡FIN!";
?>
I want to use WAMP as my development server and I'm trying to send email via Gmail, Hotmail, or Yahoo. I'm trying to implement a simple PHP email application.
Is it possible to do this in WAMP?
Is it possible to do it without changing php.ini, using ini_set() instead?
I have tried changing my php.ini.
Using my Yahoo mail:
SMTP = smtp.mail.yahoo.com
; http://php.net/smtp-port
smtp_port = 587
auth_user = me@yahoo.com
auth_pass = password
and got this error message "Warning: mail() [function.mail]: SMTP server response: 530 authentication required - for help go to http://help.yahoo.com/help/us/mail/pop/pop-11.html in C:\wamp\www\9dot_disc_alt\abc.php on line 12"
Using Gmail:
SMTP = smtp.gmail.com
; http://php.net/smtp-port
smtp_port = 587
auth_user = me@gmail.com
auth_pass = password
SMTP server response: 530 5.7.0 Must issue a STARTTLS command first. pc6sm6631754pbc.47 in C:\wamp\www\9dot_disc_alt\abc.php on line 12
Here's my current code:
$to = "me@yahoo.com";
$subject = "Test mail";
$message = "Hello! This is a simple email message.";
$from = "me@my.com";
$headers = "From: " . $from;
mail($to, $subject, $message, $headers);
echo "Mail Sent.";
Sir/Ma'am, your answers would be of great help and very much appreciated. Thank you++
When you use WAMP, your SMTP server must be your ISP's. For example, if your provider is Free:
=> SMTP = smtp.free.fr (or .com)
EDIT: You can try this: http://glob.com.au/sendmail/ ; it's a simple Windows console application that emulates sendmail, which works well with WAMP, for example ;)
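Regarding the ini_set() part of the question: the mail settings can be changed at runtime, as sketched below, but PHP's built-in mail() on Windows performs no SMTP authentication at all (auth_user and auth_pass are not standard PHP directives), which is why Yahoo and Gmail answer with 530 errors. This only works against a server that accepts unauthenticated mail, such as a local relay or your ISP's SMTP server:
<?php
// Runtime equivalents of the php.ini mail settings (Windows only).
// mail() cannot authenticate, so the target server must accept
// unauthenticated connections (e.g. a localhost relay or ISP SMTP).
ini_set('SMTP', 'localhost');
ini_set('smtp_port', '25');
ini_set('sendmail_from', 'me@my.com');

$sent = mail('me@yahoo.com', 'Test mail', 'Hello! This is a simple email message.');
echo $sent ? 'Mail Sent.' : 'Mail failed.';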
I found an online article that shows how to send email using WAMP + PHPMailer:
http://nikunj-solutions.blogspot.com/2011/08/send-email-using-wamp-server.html
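For reference, here is a minimal PHPMailer sketch along the lines of that article (placeholder Gmail credentials; assumes PHPMailer is installed, e.g. via Composer). Authenticated SMTP with STARTTLS avoids both 530 errors from the question:
<?php
use PHPMailer\PHPMailer\PHPMailer;
use PHPMailer\PHPMailer\Exception;

require 'vendor/autoload.php';

$mail = new PHPMailer(true);
try {
    $mail->isSMTP();
    $mail->Host = 'smtp.gmail.com';
    $mail->Port = 587;
    $mail->SMTPAuth = true;
    $mail->SMTPSecure = 'tls';        // STARTTLS, which Gmail requires on port 587
    $mail->Username = 'me@gmail.com'; // placeholder
    $mail->Password = 'password';     // placeholder; in practice an app password
    $mail->setFrom('me@gmail.com', 'Test');
    $mail->addAddress('me@yahoo.com');
    $mail->Subject = 'Test mail';
    $mail->Body = 'Hello! This is a simple email message.';
    $mail->send();
    echo 'Mail Sent.';
} catch (Exception $e) {
    echo 'Mail failed: ', $mail->ErrorInfo;
}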