Draw PowerBI Map from IP Addresses

I had a request from a client who has an Excel sheet with a list of IP addresses, along with other information, and he wanted to convert those IP addresses to the corresponding countries and show them on a map in Power BI.

Some analytics tools, such as Splunk, have this as a built-in function, but in Power BI you need to build it yourself. We will rely on a free web service to implement it, so let's start:

  1. Go to http://freegeoip.net/json/
  2. You should see the default data, which is the data retrieved for your own IP address.
  3. The data displayed includes Country Code, Country Name, Region Code, Region Name, City, Zip Code (if applicable), Time Zone, Latitude, Longitude, and Metro Code.
  4. If you enter a specific IP address, the corresponding data is displayed, for example: http://freegeoip.net/json/148.50.4.18 (a quick PowerShell check of this response is sketched after these steps).
  5. Now, let's get back to Power BI Desktop: click Get Data, and then select Blank Query.
  6. Click View, then Advanced Editor.
  7. Copy and paste this query into the Advanced Editor text box:
    let
        // Function that takes an IP address (text) and returns one row of geo data
        Source = (#"IP Address" as text) => let
            // Call the web service and parse the JSON response
            Source = Json.Document(Web.Contents("http://freegeoip.net/json/" & #"IP Address")),
            // Turn the record into a one-row table with the field names as column headers
            #"Converted to Table" = Record.ToTable(Source),
            #"Transposed Table" = Table.Transpose(#"Converted to Table"),
            #"Promoted Headers" = Table.PromoteHeaders(#"Transposed Table")
        in
            #"Promoted Headers"
    in
        Source
  8. Click Done, and rename the function to something meaningful, such as fn_GetRegion.
  9. Now you need to create a column with this function. To do so, from Edit Queries go to Add Column > Invoke Custom Function.
  10. Enter the data as shown below.
  11. The column will appear at the end of your table. Now expand it by clicking the double-arrow icon, as in the snapshot below.
  12. All of the columns now appear clearly.
  13. You can use them to draw your maps.
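
As mentioned in step 4, you can check what the web service returns before building anything in Power BI. Here is a minimal PowerShell sketch of that check (my addition, assuming the freegeoip.net endpoint above is reachable; the IP address is just the example from step 4):

# Quick sanity check of the geo-IP service used by fn_GetRegion
# Assumption: http://freegeoip.net/json/ is reachable from your machine
$IPAddress = "148.50.4.18"
$Response = Invoke-RestMethod -Uri ("http://freegeoip.net/json/" + $IPAddress)
# These are the same fields the Power BI function will expand into columns
$Response | Select-Object ip, country_code, country_name, region_name, city, latitude, longitude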

Thanks to guavaq, who shared this solution in this post.

List SharePoint Online Sites in PowerShell using PnP

How to retrieve the list of SharePoint Online sites in PowerShell using PnP.

Currently, PnP is considered the most modern way to work with SharePoint Online and on-premises SharePoint, because it provides one interface for both.

Here is a quick, handy script to do that. You can take the output and write it to a CSV file, import it into a SharePoint list, or store it in any other repository (a CSV variant is sketched after the script).

1. First, download the PnP PowerShell release from here:
https://github.com/SharePoint/PnP-PowerShell/releases

2. Download the library and unzip the file.
3. Now locate the module file; you should find it at this path:
C:\Program Files (x86)\SharePointPnPPowerShellOnline\Modules\SharePointPnPPowerShellOnline\SharePointPnPPowerShellOnline.psd1 


$pnpmodulepath = "C:\Program Files (x86)\SharePointPnPPowerShellOnline\Modules\SharePointPnPPowerShellOnline\SharePointPnPPowerShellOnline.psd1"
Import-Module $pnpmodulepath -Force

#Specify tenant admin and site URL
$User = "admin@yourtenant.onmicrosoft.com"
$Password = ConvertTo-SecureString -String "password" -AsPlainText -Force
$Tenant = "https://yourtenant-admin.sharepoint.com"
$CurrentCred = New-Object Management.Automation.PSCredential ($User, $Password)

Connect-PnPOnline -TenantAdminUrl $Tenant -Url $Tenant -Credentials $CurrentCred
$AllSites = Get-PnPTenantSite

foreach ($site in $AllSites)
{
    if ($site.Url.Contains("yourtenant.sharepoint.com")) #This condition skips public sites; you can remove it
    {
        Connect-PnPOnline -Url $site.Url -Credentials $CurrentCred
        Write-Host ($site.Url + ";" +
            $site.Title + ";" +
            $site.Lcid + ";" +
            $site.LastContentModifiedDate + ";" +
            $site.LockState + ";" +
            $site.StorageMaximumLevel + ";" +
            $site.StorageUsage + ";" +
            "")
        Disconnect-PnPOnline
    }
}
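
As mentioned above, instead of writing the values to the console you can collect them and export them to a CSV file. Here is a minimal sketch reusing the $AllSites variable from the script above (the property names come from the Write-Host line; the C:\Temp\Sites.csv path is just an example):

#Collect the same site properties into objects and export them to CSV
$Report = foreach ($site in $AllSites)
{
    if ($site.Url.Contains("yourtenant.sharepoint.com"))
    {
        [PSCustomObject]@{
            Url                     = $site.Url
            Title                   = $site.Title
            Lcid                    = $site.Lcid
            LastContentModifiedDate = $site.LastContentModifiedDate
            LockState               = $site.LockState
            StorageMaximumLevel     = $site.StorageMaximumLevel
            StorageUsage            = $site.StorageUsage
        }
    }
}
$Report | Export-Csv -Path "C:\Temp\Sites.csv" -NoTypeInformation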

Upload file to SharePoint Online using PowerShell

This script helps you upload your file or document to a SharePoint library using PowerShell.

  1. Download the SharePoint Online client library from here, and install it. (If you already have it, you can skip this step.)
  2. Create a folder named Temp on your C: drive, and put the Excel/CSV file in it.
  3. Modify the parameters below and run the script.


#Specify tenant admin and site URL
$User = "site.admin@tenantname.onmicrosoft.com"
$Password = "YourPassword"
$SiteURL = "https://tenantname.sharepoint.com/sites/site"
$Folder = "C:\Temp"
$DocLibName = "DocumentLibraryName"

#Add references to SharePoint client assemblies and authenticate to Office 365 site - required for CSOM
Add-Type -Path "C:\Program Files (x86)\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files (x86)\SharePoint Online Management Shell\Microsoft.Online.SharePoint.PowerShell\Microsoft.SharePoint.Client.Runtime.dll"

#Bind to site collection
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$Creds = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User, (ConvertTo-SecureString $Password -AsPlainText -Force))
$Context.Credentials = $Creds

#Retrieve the document library
$List = $Context.Web.Lists.GetByTitle($DocLibName)
$Context.Load($List)
$Context.ExecuteQuery()

#Upload every file found in the local folder
foreach ($File in (Get-ChildItem $Folder -File))
{
    $FileStream = New-Object IO.FileStream($File.FullName, [System.IO.FileMode]::Open)
    $FileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
    $FileCreationInfo.Overwrite = $true
    $FileCreationInfo.ContentStream = $FileStream
    $FileCreationInfo.Url = $File.Name
    $Upload = $List.RootFolder.Files.Add($FileCreationInfo)
    $Context.Load($Upload)
    $Context.ExecuteQuery()
    #Release the local file handle
    $FileStream.Close()
}
#Upload completed
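
If you want to confirm the uploads from the same session, here is a small optional check (my addition) that reuses the $Context and $List objects from the script above to list the files now in the library's root folder:

#Optional check: list the files in the library's root folder, reusing $Context and $List from above
$Files = $List.RootFolder.Files
$Context.Load($Files)
$Context.ExecuteQuery()
foreach ($f in $Files)
{
    Write-Host ($f.Name + " - " + $f.TimeLastModified)
}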

Export SharePoint Online Term Stores using PowerShell

How do you export SharePoint Online term stores to a CSV file, so that you can import them into any other environment?

Unfortunately, Microsoft provides a simple out-of-the-box way to import term stores using a CSV template, but did not provide a way to export them.

Below you will find a very simple PowerShell script that will let you easily export your term stores, so that you can clone them in any other tenant without much effort.


Set-ExecutionPolicy -Scope CurrentUser Unrestricted
#Specify admin user and SharePoint site URL
#Update login
#Adding references to SharePoint client assemblies
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client")
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Runtime")
[Void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint.Client.Taxonomy")
$GroupName = "Add Here the Group Name"
$TermSetName = "Add Here the TermSet Name you want to export"
$siteUrl = "https://tenant.sharepoint.com"
$UserName = "Put Your UserName Here"
$Pwd = Read-Host -Prompt "Enter your password" -AsSecureString
#Recursive function to get terms
function GetTerms([Microsoft.SharePoint.Client.Taxonomy.Term] $Term, [String]$ParentTerm, [int] $Level)
{
    $Terms = $Term.Terms;
    $Context.Load($Terms)
    $Context.ExecuteQuery();
    #Build the comma-separated chain of ancestor terms for the level columns
    if ($ParentTerm)
    {
        $ParentTerm = $ParentTerm + "," + $Term.Name;
    }
    else
    {
        $ParentTerm = $Term.Name;
    }
    Foreach ($SubTerm in $Terms)
    {
        #Child terms sit one level below the current term
        $ChildLevel = $Level + 1;
        #Up to 7 term levels are written; pad the remaining level columns with commas
        $NumofCommas = 7 - $ChildLevel;
        $commas = "";
        For ($i = 0; $i -lt $NumofCommas; $i++)
        {
            $commas = $commas + ",";
        }
        $file.Writeline("," + "," + "," + "," + $Term.Description + "," + $ParentTerm + "," + $SubTerm.Name + $commas);
        GetTerms -Term $SubTerm -ParentTerm $ParentTerm -Level $ChildLevel;
    }
}
$mycreds = New-Object System.Management.Automation.PSCredential ($UserName, $Pwd)
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($mycreds.UserName, $mycreds.Password)
$Context.Credentials = $credentials
#$Context = New-Object Microsoft.SharePoint.Client.ClientContext($Site)
#$Credentials = New-Object System.Net.NetworkCredential($User,$Pwd);
#$Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($User, $Pwd)
#$Context.Credentials = $Credentials
$MMS = [Microsoft.SharePoint.Client.Taxonomy.TaxonomySession]::GetTaxonomySession($Context)
$Context.Load($MMS)
$Context.ExecuteQuery()
#Get Term Stores
$TermStores = $MMS.TermStores
$Context.Load($TermStores)
$Context.ExecuteQuery()
$TermStore = $TermStores[0]
$Context.Load($TermStore)
$Context.ExecuteQuery()
#Get Groups
$Group = $TermStore.Groups.GetByName($GroupName)
$Context.Load($Group)
$Context.ExecuteQuery()
#Bind to Term Set
$TermSet = $Group.TermSets.GetByName($TermSetName)
$Context.Load($TermSet)
$Context.ExecuteQuery()
#Create the file and add headings
$OutputFile = "D:\Out.txt"
$file = New-Object System.IO.StreamWriter($OutputFile)
$file.Writeline("Term Set Name,Term Set Description,LCID,Available for Tagging,Term Description,Level 1 Term,Level 2 Term,Level 3 Term,Level 4 Term,Level 5 Term,Level 6 Term,Level 7 Term");
$Terms = $TermSet.Terms
$Context.Load($Terms);
$Context.ExecuteQuery();
$lineNum = 1;
Foreach ($Term in $Terms)
{
    if ($lineNum -eq 1)
    {
        ##output term set properties on the first line only
        $file.Writeline($TermSet.Name + "," + $TermSet.Description + "," + $TermStore.DefaultLanguage + "," + $TermSet.IsAvailableForTagging + "," + $Term.Description + "," + $Term.Name + "," + "," + "," + "," + "," + ",");
    }
    else
    {
        $file.Writeline("," + "," + "," + "," + $Term.Description + "," + $Term.Name + "," + "," + "," + "," + "," + ",");
    }
    $lineNum = $lineNum + 1;
    #Top-level terms are level 1; GetTerms recurses into the deeper levels
    $TermTreeLevel = 1;
    GetTerms -Term $Term -Level $TermTreeLevel -ParentTerm "";
}
$file.Flush();
$file.Close();
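
The script writes comma-separated lines to D:\Out.txt, so the output can be opened directly as CSV. Here is a small optional preview step (my addition, assuming the same D:\Out.txt path; the D:\TermStoreExport.csv copy is just an example name):

#Optional: copy the export to a .csv file and preview a few of its columns
Copy-Item -Path "D:\Out.txt" -Destination "D:\TermStoreExport.csv"
Import-Csv -Path "D:\TermStoreExport.csv" | Select-Object "Term Set Name", "Level 1 Term", "Level 2 Term", "Level 3 Term" | Format-Table -AutoSize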

Get All Databases in Your SQL Server

This SQL script is used to get all databases in your SQL Server instance, with the following details (thanks to Tibor Karaszi):

  1. Database name
  2. Data allocated
  3. Data used
  4. Log allocated
  5. Log used
  6. Recovery model
  7. Instance name

To get this information, please do the following:

1. Run the script below in a new SQL query window

USE [master]
GO
/****** Object: StoredProcedure [dbo].[sp_dbinfo] Script Date: 07/28/2017 12:24:30 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
Create PROC [dbo].[sp_dbinfo]
@sort char(1) = 'n'
,@include_instance_name char(1) = 'n'
AS
/***************************************************************************
@sort accepts 4 values: 'n' (default), 'd', 'l' and 'r'. It specifies the sort order (name, data allocated, log allocated, rollup only).
@include_instance_name accepts 2 values: 'y' and 'n'.

Written by Tibor Karaszi 2009-12-29
Modified 2010-01-19, fixed data type for db name. Thanks csm!
Modified 2010-05-24, added support for offline databases. Thanks Per-Ivan N?und.
Modified 2011-07-21, SQL Server 11, use sysperfinfo instead of DBCC SQLPERF.
Modified 2011-09-23, master instead of MASTER, also qualified sysperfinfo.
Modified 2011-12-28, renamed to sp_dbinfo, added rollup option.
Modified 2013-02-19, added recovery model and option for instance name.
***************************************************************************/
SET NOCOUNT ON
DECLARE
@sql nvarchar(2000)
,@db_name sysname
,@recovery_model varchar(12)
,@crlf char(2)

SET @crlf = CHAR(13) + CHAR(10)

--Create tables to hold space usage stats from commands
CREATE TABLE #logspace
(
database_name sysname NOT NULL
,log_size real NOT NULL
,log_percentage_used real NOT NULL
)
CREATE TABLE #dbcc_showfilestats
(
database_name sysname NULL
,file_id_ int NOT NULL
,file_group int NOT NULL
,total_extents bigint NOT NULL
,used_extents bigint NOT NULL
,name_ sysname NOT NULL
,file_name_ nvarchar(3000) NOT NULL
)
--Create table to hold final output
CREATE TABLE #final_output
(
database_name sysname
,data_allocated int
,data_used int
,log_allocated int
,log_used int
,is_sum bit
)
--Populate log space usage
INSERT INTO #logspace(database_name, log_size, log_percentage_used)
SELECT
instance_name AS 'Database Name'
,MAX(CASE
WHEN counter_name = 'Log File(s) Size (KB)' THEN cntr_value / 1024.
ELSE 0
END) AS 'Log Size (MB)'
,MAX(CASE
WHEN counter_name = 'Percent Log Used' THEN cntr_value
ELSE 0
END) AS 'Log Space Used (%)'
FROM master..sysperfinfo
WHERE counter_name IN('Log File(s) Size (KB)', 'Percent Log Used')
AND instance_name != '_total'
GROUP BY instance_name
----Populate data space usage
DECLARE db CURSOR FOR SELECT name FROM sys.databases WHERE state_desc = 'ONLINE'
OPEN db
WHILE 1 = 1
BEGIN
FETCH NEXT FROM db INTO @db_name
IF @@FETCH_STATUS <> 0
BREAK
SET @sql = 'USE ' + QUOTENAME(@db_name) + ' DBCC SHOWFILESTATS WITH NO_INFOMSGS'
INSERT INTO #dbcc_showfilestats(file_id_, file_group, total_extents, used_extents, name_, file_name_)
EXEC (@sql)
UPDATE #dbcc_showfilestats SET database_name = @db_name WHERE database_name IS NULL
END
CLOSE db
DEALLOCATE db
--Result into final table
INSERT INTO #final_output(database_name, data_allocated, data_used, log_allocated, log_used, is_sum)
SELECT
CASE WHEN d.database_name IS NOT NULL THEN d.database_name ELSE '[ALL]' END AS database_name
,ROUND(SUM(CAST((d.data_alloc * 64.00) / 1024 AS DECIMAL(18,2))), 0) AS data_allocated
,ROUND(SUM(CAST((d.data_used * 64.00) / 1024 AS DECIMAL(18,2))), 0) AS data_used
,ROUND(SUM(CAST(log_size AS numeric(18,2))), 0) AS log_allocated
,ROUND(SUM(CAST(log_percentage_used * 0.01 * log_size AS numeric(18,2))), 0) AS log_used
,GROUPING(d.database_name) AS is_sum
FROM
(
SELECT database_name, SUM(total_extents) AS data_alloc, SUM(used_extents) AS data_used
FROM #dbcc_showfilestats
GROUP BY database_name
) AS d
INNER JOIN #logspace AS l ON d.database_name = l.database_name
INNER JOIN sys.databases AS sd ON d.database_name = sd.name
GROUP BY d.database_name WITH ROLLUP
--Output result
SET @sql = '
SELECT f.database_name, f.data_allocated, f.data_used, f.log_allocated, f.log_used, d.recovery_model_desc' +
CASE @include_instance_name WHEN 'y' THEN ', @@SERVERNAME AS instance_name' ELSE '' END + @crlf +
'FROM #final_output AS f LEFT OUTER JOIN sys.databases AS d ON f.database_name = d.name' + @crlf +
CASE WHEN @sort = 'r' THEN 'WHERE f.database_name = ''[ALL]''' ELSE '' END + @crlf +
'ORDER BY is_sum' + @crlf +
CASE
WHEN @sort = 'n' THEN ', database_name'
WHEN @sort = 'd' THEN ', data_allocated DESC'
WHEN @sort = 'l' THEN ', log_allocated DESC'
ELSE ''
END
--PRINT @sql
EXEC(@sql)

2. This will create a procedure named dbo.sp_dbinfo.
To confirm, check Databases > System Databases > master > Programmability > Stored Procedures > dbo.sp_dbinfo

3. Now run this command

USE [master]
GO
EXEC sp_dbinfo 'd', 'y'

You should see a result like the one below.

You can also pick one of these commands for different output:

--Test execution
/*
EXEC sp_dbinfo
EXEC sp_dbinfo 'n'
EXEC sp_dbinfo 'd'
EXEC sp_dbinfo 'l'
EXEC sp_dbinfo 'r'
EXEC sp_dbinfo 'n', 'y'
EXEC sp_dbinfo 'd', 'y'
*/
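
If you prefer to run the procedure from PowerShell, like the rest of the scripts in these posts, here is a minimal sketch using Invoke-Sqlcmd from the SqlServer module (the server instance name below is a placeholder):

#Run sp_dbinfo from PowerShell and show the result as a table (requires the SqlServer module)
Import-Module SqlServer
Invoke-Sqlcmd -ServerInstance "YourSqlServer\Instance" -Database "master" -Query "EXEC sp_dbinfo 'd', 'y'" |
    Format-Table database_name, data_allocated, data_used, log_allocated, log_used, recovery_model_desc, instance_name -AutoSize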

Good luck!