Programming & Mustangs!

A place for tutorials on programming and other such works.


Tag: recordmetadata

Kafka & Java: Secured Producer Send Record

In this tutorial I will show you how to send a record to a Kerberos-secured Kafka cluster. Before you begin you will need Maven and Eclipse set up and a project ready to go. If you haven’t kerberized your Kafka installation yet, please do so first.

Import SSL Cert to Java:

Follow this tutorial’s “Installing unlimited strength encryption Java libraries” section.

If on Windows, do the following:

#Import it
"C:\Program Files\Java\jdk1.8.0_171\bin\keytool" -import -file hadoop.csr -keystore "C:\Program Files\Java\jdk1.8.0_171\jre\lib\security\cacerts" -alias "hadoop"

#Check it
"C:\Program Files\Java\jdk1.8.0_171\bin\keytool" -list -v -keystore "C:\Program Files\Java\jdk1.8.0_171\jre\lib\security\cacerts"

#If you want to delete it
"C:\Program Files\Java\jdk1.8.0_171\bin\keytool" -delete -alias hadoop -keystore "C:\Program Files\Java\jdk1.8.0_171\jre\lib\security\cacerts"
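If you prefer to verify the import programmatically rather than reading keytool output, the JDK’s `KeyStore` API can check for the alias directly. A minimal sketch; the cacerts path matches the JDK install used in this tutorial and `changeit` is the JDK’s default trust store password, so adjust both for your environment:

```java
import java.io.FileInputStream;
import java.security.KeyStore;

public class CheckCacerts {
    public static void main(String[] args) throws Exception {
        // Default JKS trust store shipped with the JDK; adjust for your install
        String path = "C:\\Program Files\\Java\\jdk1.8.0_171\\jre\\lib\\security\\cacerts";
        KeyStore ks = KeyStore.getInstance(KeyStore.getDefaultType());
        try (FileInputStream in = new FileInputStream(path)) {
            ks.load(in, "changeit".toCharArray());
        }
        // True once the keytool -import step above has been run
        System.out.println("hadoop cert present: " + ks.containsAlias("hadoop"));
    }
}
```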

POM.xml

<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>1.1.0</version>
</dependency>

Imports

import org.apache.kafka.clients.producer.*;
import java.util.Properties;
import java.io.InputStream;

Producer JAAS Conf (client_jaas.conf)

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=false
    refreshKrb5Config=true
    debug=true
    useKeyTab=true
    storeKey=true
    keyTab="c:\\data\\kafka.service.keytab"
    principal="kafka/hadoop@REALM.CA";
};
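You can sanity-check that the JVM parses this file before involving Kafka at all: point `java.security.auth.login.config` at it and ask the stdlib JAAS `Configuration` for the `KafkaClient` entry. A sketch, assuming the same file path the producer sets later:

```java
import javax.security.auth.login.AppConfigurationEntry;
import javax.security.auth.login.Configuration;

public class CheckJaas {
    public static void main(String[] args) {
        // Same property the producer sets later; adjust the path to your client_jaas.conf
        System.setProperty("java.security.auth.login.config",
                "C:\\data\\kafkaconnect\\kafka\\src\\main\\resources\\client_jaas.conf");
        // Force JAAS to re-read the file named by the property
        Configuration.getConfiguration().refresh();
        AppConfigurationEntry[] entries =
                Configuration.getConfiguration().getAppConfigurationEntry("KafkaClient");
        if (entries == null) {
            System.out.println("KafkaClient section not found - check the file path");
        } else {
            // Should print com.sun.security.auth.module.Krb5LoginModule
            System.out.println(entries[0].getLoginModuleName());
        }
    }
}
```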

Producer Props File

See the Kafka producer configuration documentation for the full list of producer properties.

bootstrap.servers=hadoop:9094
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
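At runtime the producer reads this file through `java.util.Properties`, which is plain key=value parsing. A quick stdlib sketch of what the load step in the connection code below does, with the props content inlined instead of read from the classpath:

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.util.Properties;

public class LoadProps {
    public static void main(String[] args) throws Exception {
        String props = "bootstrap.servers=hadoop:9094\n"
                + "security.protocol=SASL_SSL\n"
                + "sasl.kerberos.service.name=kafka\n";
        Properties properties = new Properties();
        // Same call the producer code uses on the producer.props resource stream
        try (InputStream in = new ByteArrayInputStream(props.getBytes())) {
            properties.load(in);
        }
        System.out.println(properties.getProperty("security.protocol")); // prints SASL_SSL
    }
}
```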

Initiate Kerberos Authentication

System.setProperty("java.security.auth.login.config", "C:\\data\\kafkaconnect\\kafka\\src\\main\\resources\\client_jaas.conf");
System.setProperty("https.protocols", "TLSv1,TLSv1.1,TLSv1.2");
System.setProperty("java.security.krb5.conf", "C:\\Program Files\\Java\\jdk1.8.0_171\\jre\\lib\\security\\krb5.conf");
System.setProperty("java.security.krb5.realm", "REALM.CA");
System.setProperty("java.security.krb5.kdc", "REALM.CA");
System.setProperty("sun.security.krb5.debug", "false");
System.setProperty("javax.net.debug", "false");
System.setProperty("javax.net.ssl.keyStorePassword", "changeit");
System.setProperty("javax.net.ssl.keyStore", "C:\\Program Files\\Java\\jdk1.8.0_171\\jre\\lib\\security\\cacerts");
System.setProperty("javax.net.ssl.trustStore", "C:\\Program Files\\Java\\jdk1.8.0_171\\jre\\lib\\security\\cacerts");
System.setProperty("javax.net.ssl.trustStorePassword", "changeit");
System.setProperty("javax.security.auth.useSubjectCredsOnly", "true");

Producer Connection/Send

The record we will send will just be a string for both key and value.

Producer<String, String> producer = null;

try {
    ClassLoader classLoader = getClass().getClassLoader();

    //Get the props file and load it into the producer.
    try (InputStream props = classLoader.getResourceAsStream("producer.props")) {
        Properties properties = new Properties();
        properties.load(props);
        producer = new KafkaProducer<>(properties);
    }

    //Set up the record to send
    ProducerRecord<String, String> rec = new ProducerRecord<>("testTopic", "Key", "Value");

    //Send the record and block until the broker acknowledges it
    RecordMetadata recMetaData = producer.send(rec).get();

    //You can now print out any relevant information you want about the RecordMetadata
    System.out.println("Producer Record Sent");
} finally {
    if (producer != null) {
        producer.flush();
        producer.close();
    }
}

References

I used kafka-sample-programs as a guide for setting up props.

Author: Oliver | Posted on July 19, 2018 (updated July 29, 2018) | Categories: Java, BigData, Kafka | Tags: recordmetadata, producer, keytab, KafkaProducer, kerberos, kafka, producerrecord

Kafka & Java: Unsecure Producer Send Record

In this tutorial I will show you how to send a record to an unsecured Kafka cluster. Before you begin you will need Maven and Eclipse set up and a project ready to go. If you haven’t installed Kafka yet, please do so first.

POM.xml

<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>1.1.0</version>
</dependency>

Imports

import org.apache.kafka.clients.producer.*;
import java.util.Properties;
import java.io.InputStream;

Producer Props File

See the Kafka producer configuration documentation for the full list of producer properties.

# The url to kafka
bootstrap.servers=localhost:9092

# The number of acknowledgments the producer requires the leader to have received before considering a request complete
acks=all

# Setting a value greater than zero will cause the client to resend any record whose send fails with a potentially transient error
retries=1

# The producer will attempt to batch records together into fewer requests whenever multiple records are being sent to the same partition
batch.size=16384

# The producer groups together any records that arrive in between request transmissions into a single batched request. Normally this occurs only under load when records arrive faster than they can be sent out
linger.ms=0

# The serializer for the key
key.serializer=org.apache.kafka.common.serialization.StringSerializer

# The serializer for the value
value.serializer=org.apache.kafka.common.serialization.StringSerializer

# The configuration controls the maximum amount of time the client will wait for the response of a request
request.timeout.ms=60000

Producer Connection/Send

The record we will send will just be a string for both key and value.

Producer<String, String> producer = null;

try {
    ClassLoader classLoader = getClass().getClassLoader();

    //Get the props file and load it into the producer.
    try (InputStream props = classLoader.getResourceAsStream("producer.props")) {
        Properties properties = new Properties();
        properties.load(props);
        producer = new KafkaProducer<>(properties);
    }

    //Set up the record to send
    ProducerRecord<String, String> rec = new ProducerRecord<>("testTopic", "Key", "Value");

    //Send the record and block until the broker acknowledges it
    RecordMetadata recMetaData = producer.send(rec).get();

    //You can now print out any relevant information you want about the RecordMetadata
    System.out.println("Producer Record Sent");
} finally {
    if (producer != null) {
        producer.flush();
        producer.close();
    }
}

References

I used kafka-sample-programs as a guide for setting up props.

Author: Oliver | Posted on December 23, 2017 (updated July 20, 2018) | Categories: BigData, Kafka | Tags: kafka, producerrecord, recordmetadata, producer, KafkaProducer
