
DB Relationships

Hello,

I was trying to figure out how to plan a DB for the following:

I have a set of students, and each one is studying some topics such as maths, physics, biology, English and so on.

Each group of topics belongs to a macro group (maths and physics to group A, biology and English to group B).

For each topic a student gets a mark (good, bad, average).

The problem is that if I have a table "students" and another table "Marks", in that second table I would end up with:

ID, STUDENT_ID, MATH, PHYSICS, BIOLOGY, ENGLISH
1, 1, good, bad, good, good

Now, to split the topics into different areas, should I build separate tables (AREA1, AREA2) containing the topics and link them to the students table?
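
For illustration, a normalized layout along these lines is what I am imagining (table and column names are just placeholders):

Code:

-- Placeholder names; one possible normalized layout.
CREATE TABLE students (
    student_id INTEGER PRIMARY KEY,
    name       VARCHAR(100) NOT NULL
);

CREATE TABLE areas (                      -- the macro groups (A, B, ...)
    area_id   INTEGER PRIMARY KEY,
    area_name VARCHAR(50) NOT NULL
);

CREATE TABLE topics (                     -- maths, physics, biology, English, ...
    topic_id   INTEGER PRIMARY KEY,
    topic_name VARCHAR(50) NOT NULL,
    area_id    INTEGER NOT NULL REFERENCES areas (area_id)
);

CREATE TABLE marks (                      -- one row per student per topic
    student_id INTEGER NOT NULL REFERENCES students (student_id),
    topic_id   INTEGER NOT NULL REFERENCES topics (topic_id),
    mark       VARCHAR(10) NOT NULL,      -- good / bad / average
    PRIMARY KEY (student_id, topic_id)
);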

Thank you !

Looking for an easier solution

I have an extensive Excel report that gets populated with data every month. The data comes from user-entered records in my Access database. The Excel report is required in its current format and the powers that be will not accept an Access report with the same data.

The Excel report has some 400 rows of data and it is divided into sections (Facilities, Number of People, Miles, Minutes, Ages, etc.)

I have an Access Report (that is much prettier) that compiles the data for each of those sections where it can be reviewed easily. Once it is reviewed, it is exported to the Excel report at the click of the button.

The data is compiled by month. The dates are selected in a central report compilation form.

I will use the Facilities section as my example as it is the most extensive section of the Excel report. My module looks something like this:

Code:

strRpt = "CAMTS_AgencyMisType"

DestName = "Excel Report File Here"

Set xlApp = New Excel.Application
    With xlApp
        .visible = True
       
Set xlWB = .Workbooks.Open(DestName)
    With xlWB
        With .Sheets("2003TC")
           
            'COLUMN DETERMINATION, BASED ON DATES SELECTED IN THE CENTRAL REPORT COMPILATION FORM
                Select Case Forms!Reports!txtStart
                    Case "Jul"
                        colValue = "C"
                    Case "Aug"
                        colValue = "D"
                    Case "Sep"
                        colValue = "E"
                    Case "Oct"
                        colValue = "F"
                    Case "Nov"
                        colValue = "G"
                    Case "Dec"
                        colValue = "H"
                    Case "Jan"
                        colValue = "I"
                    Case "Feb"
                        colValue = "J"
                    Case "Mar"
                        colValue = "K"
                    Case "Apr"
                        colValue = "L"
                    Case "May"
                        colValue = "M"
                    Case "Jun"
                        colValue = "N"
                End Select
               
                'HOW I GET THE DATA FROM THE REPORT TO THE EXCEL SHEET. colValue = Month and correlates to the column
                        '40's
                        .Cells.Range(colValue & 43).Value = Reports(strRpt)!txtFirstFacility
                        .Cells.Range(colValue & 44).Value = Reports(strRpt)!txtSecondFacility
                        .Cells.Range(colValue & 45).Value = Reports(strRpt)!txtThirdFacility
                        .Cells.Range(colValue & 46).Value = Reports(strRpt)!txtFourthFacility
                        .Cells.Range(colValue & 47).Value = Reports(strRpt)!txtFifthFacility
                    etc. etc. etc about 300+ more times

The problem is, sometimes I have to add new rows to the Excel sheet to add another facility. This means that I have to go back into the module and update the row numbers (colValue & rowNumber).

I want to be able to change the row numbers in a table rather than in the code. This will make it much friendlier to work with and less prone to mistakes.

What I can't figure out is how to relate the row number in the .Cells.Range(colValue & rowNumber).Value = Reports(...) lines to the row number on the Excel report.

Here's what I did so far, and it works, but it looks very redundant and I think there is surely a better way to do it.

Code:

strSQL(0) = "SELECT rowNumber FROM RptCardDataCells WHERE ctlName = ""SumofTotMiles"""
strSQL(1) = "SELECT rowNumber FROM RptCardDataCells WHERE ctlName = ""SumofLdMiles"""
strSQL(2) = "SELECT rowNumber FROM RptCardDataCells WHERE ctlName = ""SumoffltMinutes"""

Set db = CurrentDb

' Look up the Excel row for each control, then write that control's value
Set rs = db.OpenRecordset(strSQL(0))
    rowValue = rs!rowNumber
            .Cells.Range(colValue & rowValue).Value = Reports(strRpt)!txtFirstFacility


Set rs = db.OpenRecordset(strSQL(1))
    rowValue = rs!rowNumber
            .Cells.Range(colValue & rowValue).Value = Reports(strRpt)!txtSecondFacility


Set rs = db.OpenRecordset(strSQL(2))
    rowValue = rs!rowNumber
            .Cells.Range(colValue & rowValue).Value = Reports(strRpt)!txtThirdFacility

etc. etc. 300+ times

rs.Close
Set rs = Nothing

I've also considered drawing the data from the queries themselves but I'm not sure how to do that for this scenario.
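
For what it's worth, one direction I am considering is opening a single recordset over the whole mapping table and looping through it, instead of one query per control. A sketch (assuming RptCardDataCells holds one row per control, as the WHERE clauses above suggest):

Code:

-- One pass over the mapping; the VBA loop could then do something like
--   .Cells.Range(colValue & rs!rowNumber).Value = Reports(strRpt)(rs!ctlName)
-- for each record, instead of 300+ hand-written lines.
SELECT ctlName, rowNumber
FROM RptCardDataCells
ORDER BY rowNumber;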

Anybody want to give it a shot?

How to query retrieve data every Thursday

Hi,

I want to produce a weekly report with a query. I don't have any field in the table that stores the day of the week, but I want to run the query every Thursday.

Could you please tell me how?
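
To make the question concrete, what I am after is something along these lines (table and column names are placeholders, and the syntax assumed here is SQL Server's):

Code:

-- Rows from the seven days ending today, intended to be run by a
-- scheduler (e.g. a weekly job) every Thursday.
SELECT *
FROM SomeTable
WHERE SomeDateColumn >= DATEADD(day, -7, CAST(GETDATE() AS date))
  AND SomeDateColumn <  DATEADD(day, 1,  CAST(GETDATE() AS date));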

trigger function - postgresql - case when error

Hi all,
Thanks in advance for all your input.

I created a function sumlast and a trigger update_actuals.

I have the following tables where I insert info:
log1 - where I insert some time info (hours)
actuals - the last updated hours info is stored here; basically each machine has only one entry in the actuals table
selectedmachine - from this table I select the machine I'm currently updating the info for

and one view:
time_view - which calculates the hours from the figures inserted in log1.

Code:

-- Trigger: update_actuals_tg on log1

-- DROP TRIGGER update_actuals_tg on log1;

CREATE TRIGGER update_actuals_tg
  BEFORE INSERT
  ON log1
  FOR EACH ROW
  EXECUTE PROCEDURE sumlast();


Code:

-- Function: sumlast()

-- DROP FUNCTION sumlast();

CREATE OR REPLACE FUNCTION sumlast()
 RETURNS trigger AS
$BODY$begin
CASE actuals.idmachine
WHEN  selectedmachine.idmachine THEN
update actuals
set
hours = hours + (select time from time_view
where idlog = (select max(idlog) from time_view));
END CASE;
return new;
end$BODY$
  LANGUAGE plpgsql VOLATILE
  COST 100;
ALTER FUNCTION sumlast()
  OWNER TO user1;

This is the error I received when I update the log1 table. Any idea what I'm doing wrong?
The CASE WHEN condition is causing the error. Basically, through this condition I would like to update the hours only for the idmachine that is selected.


pgAdmin III reports:

ERROR: missing FROM-clause entry for table "actuals"
LINE 1: SELECT actuals.idmachine
               ^
QUERY: SELECT actuals.idmachine
CONTEXT: PL/pgSQL function sumlast() line 2 at CASE
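
For reference, the behaviour I am trying to get could probably be written without the CASE at all, something like the sketch below; I have not verified this, which is partly why I am asking:

Code:

-- A sketch only: an UPDATE restricted to the selected machine, instead of
-- a CASE expression that references a table row the function never selected.
CREATE OR REPLACE FUNCTION sumlast()
  RETURNS trigger AS
$BODY$
BEGIN
  UPDATE actuals
     SET hours = hours + (SELECT time FROM time_view
                          WHERE idlog = (SELECT max(idlog) FROM time_view))
   WHERE idmachine IN (SELECT idmachine FROM selectedmachine);
  RETURN NEW;
END
$BODY$
  LANGUAGE plpgsql VOLATILE;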

Making two QUERIES as one

Hi everybody, I am working with SQLite and I am trying to write a single query that interrogates the same table twice.
First I need to select a row by a particular key "ID", creating a VIEW that selects all the values I need (val_1, val_2, val_3, val_4).
I then selected, with a second query, all the "val" columns from the "test" table that have the same values as the VIEW.
I need to turn the two into a single query. Any idea?

Here are the two queries:

Code:

CREATE VIEW view_test AS
SELECT "test"."ID",
       "test"."val_1",
       "test"."val_2",
       "test"."val_3",
       "test"."val_4"
FROM "test"
WHERE "test"."ID" = 8;

Code:

SELECT "test"."ID",
       "test"."val_1",
       "test"."val_2",
       "test"."val_3",
       "test"."val_4"
FROM "test", "view_test"
WHERE "test"."val_1" = "view_test"."val_1"
  AND "test"."val_2" = "view_test"."val_2"
  AND "test"."val_3" = "view_test"."val_3"
  AND "test"."val_4" = "view_test"."val_4";
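
What I am hoping for is something along the lines of this self-join sketch (untested), which would drop the intermediate view entirely:

Code:

-- A self-join sketch: every row of "test" whose val_1..val_4 match those
-- of the row with ID = 8, without creating view_test first.
SELECT t."ID", t."val_1", t."val_2", t."val_3", t."val_4"
FROM "test" AS t
JOIN "test" AS ref ON ref."ID" = 8
WHERE t."val_1" = ref."val_1"
  AND t."val_2" = ref."val_2"
  AND t."val_3" = ref."val_3"
  AND t."val_4" = ref."val_4";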



Thank you!

HELP : My database is in an inconsistent state

hi,

I'm an Informix IDS 10.0 user. I started my Informix instance with oninit after a crash. The database seems to be online, but when I run ontape -a, there's an error indicating the database is in an inconsistent state.

onstat -m shows the message below:

listerner-thread: err = -951: oserr =13: errstr=root@dbservername: incorrect password or user root@dbservername is not on the database server.
system error = 13.

Can someone please help me?

Your thoughts on this script?

I think I am almost done; I am just missing one or two pieces, I believe. If you could take a quick look and give your opinion, I would be grateful.

Some background on where I am now with the objects in Access:

I have a linked table called DBO_VW_PPDREPORT, and off of this I built a query called qryPPDREPORT. This query is essentially the same as the view in SSMS, with the date range choked down to a hard-coded value ('2014-11').

The code is tagged below. The end goal is to run the script to loop through the results and save off the filtered results into an Excel file. Pretty basic for an expert; however, I am not one.

I did have a parameter set in the query (qryPPDREPORT); however, I felt that might be problematic, so I hard-coded the value in the query.

The code listed below is something I found on another Access site. The error message is run-time error 3061, "Too few parameters. Expected 2." I'm probably completely missing something.

Code:

Private Sub Command9_Click()  'Help me :)

Dim qdf As DAO.QueryDef
Dim dbs As DAO.Database
Dim rstMgr As DAO.Recordset
Dim strSQL As String, strTemp As String, strMgr As String

' Replace PutEXCELFileNameHereWithoutdotxls with actual EXCEL
' filename without the .xls extension
' (for example, MyEXCELFileName, BUT NOT MyEXCELFileName.xls)
Const strFileName As String = "Please"
Const strQName As String = "zExportQuery"

Set dbs = CurrentDb

' Create temporary query that will be used for exporting data;
' we give it a dummy SQL statement initially (this name will
' be changed by the code to conform to each manager's identification)

strTemp = dbs.TableDefs(0).Name
strSQL = "SELECT * FROM [" & strTemp & "] WHERE 1=0;"
Set qdf = dbs.CreateQueryDef(strQName, strSQL)
qdf.Close
strTemp = strQName

' *** code to set strSQL needs to be changed to conform to your
' *** database design -- ManagerID and EmployeesTable need to
' *** be changed to your table and field names
' Get list of ManagerID values -- note: replace my generic table and field names
' with the real names of the EmployeesTable table and the ManagerID field

strSQL = "SELECT DISTINCT Supplier FROM qryPPDREPORT;"  ' Getting error message here stating two few parameters
Set rstMgr = dbs.OpenRecordset(strSQL, dbOpenDynaset, dbReadOnly)

' Now loop through list of ManagerID values and create a query for each ManagerID
' so that the data can be exported -- the code assumes that the actual names
' of the managers are in a lookup table -- again, replace generic names with
' real names of tables and fields
If rstMgr.EOF = False And rstMgr.BOF = False Then
      rstMgr.MoveFirst
      Do While rstMgr.EOF = False
' *** code to set strMgr needs to be changed to conform to your
' *** database design -- ManagerNameField, ManagersTable, and
' *** ManagerID need to be changed to your table and field names
' *** be changed to your table and field names
            strMgr = DLookup("SUPPLIER", "qryPPDREPORT", _
                  "SUPPLIER_ID = " & rstMgr!SUPPLIER_ID.Value)
' *** code to set strSQL needs to be changed to conform to your
' *** database design -- ManagerID, EmployeesTable need to
' *** be changed to your table and field names
            strSQL = "SELECT * FROM qryPPDREPORT WHERE " & _
                  "SUPPLIER_ID = " & rstMgr!SUPPLIER_ID.Value & ";"
            Set qdf = dbs.QueryDefs(strTemp)
            qdf.Name = "q_" & strMgr
            strTemp = qdf.Name
            qdf.sql = strSQL
            qdf.Close
            Set qdf = Nothing
' Replace C:\FolderName\ with actual path
            DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
                  strTemp, "C:\MFG\" & strFileName & ".xls"
            rstMgr.MoveNext
      Loop
End If

rstMgr.Close
Set rstMgr = Nothing

dbs.QueryDefs.Delete strTemp
dbs.Close
Set dbs = Nothing
End Sub

insert into, select multiple records

Hi

I have a main form that has four subforms.
These subforms are filtered based on the selections on the main form (via four comboboxes).
The user selects a record in each of these subforms, and I want to save each user's selections in another table.

The query below inserts a record based on the user's selection in the first subform only. How do I use UNION ALL to get it to include all four subform selections?

INSERT INTO dbo_TjnStudentClassProgram ( StudentID, ClassID, PSID, ProgramID )
SELECT [Forms]![FrmStudentInformation]![StudentID] AS StudentID,
[Forms]![FrmSelectClasses]![Combo0] AS ClassID,
[Forms]![FrmSelectClasses]![FrmSelectClasses Subform1].[Form]![PSID] AS PSID,
[Forms]![FrmSelectClasses]![FrmSelectClasses Subform1].[Form]![ProgramID] AS ProgramID;
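
What I have in mind is roughly the sketch below (untested; Combo2 and Subform2 are just placeholder names for the second subform's controls, and the remaining branches would follow the same pattern; I am also not sure whether Access insists on a FROM clause for each branch of the UNION):

Code:

INSERT INTO dbo_TjnStudentClassProgram ( StudentID, ClassID, PSID, ProgramID )
SELECT [Forms]![FrmStudentInformation]![StudentID],
       [Forms]![FrmSelectClasses]![Combo0],
       [Forms]![FrmSelectClasses]![FrmSelectClasses Subform1].[Form]![PSID],
       [Forms]![FrmSelectClasses]![FrmSelectClasses Subform1].[Form]![ProgramID]
UNION ALL
SELECT [Forms]![FrmStudentInformation]![StudentID],
       [Forms]![FrmSelectClasses]![Combo2],
       [Forms]![FrmSelectClasses]![FrmSelectClasses Subform2].[Form]![PSID],
       [Forms]![FrmSelectClasses]![FrmSelectClasses Subform2].[Form]![ProgramID];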

Data Pipeline Crisis

I know some SQL, but I've been given the task of developing a data pipeline from an email account that receives a daily email with an Excel file. The file has to be downloaded and then automatically loaded into a MySQL table. I have no clue how to go about automating this. Does Talend help in some way?

Help please!

SSL Certificates in Windows 7 and Postgres

I am trying to implement SSL certificates with Postgres 9.3 locally on Windows 7. In Windows Component Services / Local Services, postgres is configured to start automatically, with Log On as a local system account.

Using my Windows administrator account, in a command prompt inside my data folder, when I execute postgres -D . , I get the message, "Redirecting logging output to the logging collector service." I get this error message in my log file:

Code:

2014-11-09 03:05:13 GMT LOG:  client certificates can only be checked if a root certificate store is available
2014-11-09 03:05:13 GMT HINT:  Make sure the configuration parameter "ssl_ca_file" is set.
2014-11-09 03:05:13 GMT CONTEXT:  line 2 of configuration file "D:/PostgresDat/pg_hba.conf"
2014-11-09 03:05:13 GMT FATAL:  could not load pg_hba.conf

When I try to connect in PgAdminIII I get the error message, "Server isn't listening"

What am I doing wrong? Right now, just for development purposes, do I need to have a root certificate? I tried unsuccessfully to create one with makecert but couldn't get the flags and options right.
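
My understanding of the HINT above is that the clientcert=1 lines need ssl_ca_file pointing at a root certificate store; once I have one, I assume the relevant pieces would look roughly like this (a sketch only, with root.crt as a placeholder file name in the data directory):

Code:

# postgresql.conf (sketch)
ssl           = on
ssl_cert_file = 'server.crt'
ssl_key_file  = 'server.key'
ssl_ca_file   = 'root.crt'     # the root certificate store the HINT refers to

# pg_hba.conf (sketch)
hostssl  all  all  127.0.0.1/32  cert  clientcert=1
hostssl  all  all  ::1/128       cert  clientcert=1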


I think I correctly followed the postgres & openssl documentation for creating the privkey.pem, server.req, server.key and server.crt files, i.e.:

1. openssl genrsa -out privkey.pem 2048
2. openssl req -new -key privkey.pem -out server.req -config "D:\openssl\v9.8\openssl.cnf"
3. openssl rsa -in privkey.pem -out server.key
4. openssl req -x509 -in server.req -text -key server.key -out server.crt -config "D:\openssl\v9.8\openssl.cnf"


This is the entire pg_hba.conf file:

Code:

# TYPE  DATABASE  USER    ADDRESS    METHOD
hostssl  all  all    127.0.0.1/32  cert  clientcert=1
hostssl  postgres  postgres  ::1/128  trust
#hostssl  all  all    ::1/128        cert  clientcert=1


I am not sure which of those last two lines in the pg_hba.conf file I should be using to require SSL certificates for all postgres accounts. Is it even possible to require an SSL certificate for the postgres account?

This is the entire postgresql.conf file:

Code:

listen_addresses = '*'               
port = 5432                                # (change requires restart)
max_connections = 100                        # (change requires restart)
# - Security and Authentication -
ssl = on                                # (change requires restart)
ssl_ciphers = 'DEFAULT:!LOW:!EXP:!MD5:@STRENGTH'        # allowed SSL ciphers
ssl_renegotiation_limit = 512MB        # amount of data between renegotiations
ssl_cert_file = 'server.crt'        # (change requires restart)
ssl_key_file = 'server.key'                # (change requires restart)
#ssl_ca_file = 'root.crt'
password_encryption = on
shared_buffers = 128MB                        # min 128kB

# ERROR REPORTING AND LOGGING
# - Where to Log -
log_destination = 'stderr'
# This is used when logging to stderr:
logging_collector = on        # Enable capturing of stderr and csvlog
                                        # into log files. Required to be on for
                                        # csvlogs.
                                        # (change requires restart)
log_line_prefix = '%t '        # special values:

# - Locale and Formatting -
datestyle = 'iso, mdy'
timezone = 'US/Central'
lc_messages = 'English_United States.1252'                # locale for system error message
lc_monetary = 'English_United States.1252'                # locale for monetary formatting
lc_numeric = 'English_United States.1252'                        # locale for number formatting
lc_time = 'English_United States.1252'                        # locale for time formatting

# default configuration for text search
default_text_search_config = 'pg_catalog.english'

---------------------
I also tried changing the data folder attribute from Read Only to allow Read/Write (I was already logged in as Administrator), but the errors are the same. In any case, Windows automatically changes the data folder attribute back to Read Only. The only Windows groups that have full permission on the data folder are SYSTEM, Administrators and my administrator user account.

If I remove the SSL-related lines in pg_hba.conf and postgresql.conf, and use the following lines instead in pg_hba.conf, I am able to connect to the database using PgAdminIII:

host all all 127.0.0.1/32 trust
host all all ::1/128 trust

However, even after doing that and setting ssl=off in postgresql.conf, when I open a command prompt and execute "postgres -D ." in the data folder, I get these errors in the console:

Code:

could not bind Ipv6 socket.  No error. Is another postmaster running on port 5432?
could not bind Ipv4 socket.  No. error.  Is another postmaster running on port 5432?
Could not create any listen sockets for "*"
Could not create any TCP / IP sockets

With that, there are no entries in the postgres log file.

Thank you in advance.

SQL Server 2012 reinstalling

Hi All,
Do I need to uninstall SQL Server 2012 completely to run the setup again? Our back end has SQL Server 2005, SQL Server 2008 and SQL Server 2012, and several of our live databases are running on the SQL Server 2012 instance. We are in a situation where we need to reinstall SQL Server 2012 with the same instance name. Is it possible to do this without uninstalling the existing installation? If I select the repair option, will it affect the live databases? Do I have to restore the databases after the re-installation?


Thanks to all in advance.

userid with no password

hi guys,
How do I select the usernames on Sybase that have no password? Thanks.
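
To show what I mean, something along these lines is what I am after (a sketch only; I am not sure of the exact catalog columns on my ASE version):

Code:

-- Sketch: logins in master..syslogins whose stored password is NULL
-- (column names can differ between ASE versions).
SELECT name
FROM master..syslogins
WHERE password IS NULL;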

Insert files to db2 table

Hello,
To start, I'd like to say that I did look around for an answer; I just couldn't find anything that works or is explained well enough (maybe I'm just missing something).

I need help with loading/inserting files into a DB2 table, into a BLOB column. I need to insert images, videos and so on, so the files are relatively big (not a few KB but MBs).
What is important is that the file must reside in the database itself, not just a pointer to the file or something like that.
Currently I am using DB2 10.5 on Windows.

So far I came up with this solution:
CALL SYSPROC.ADMIN_CMD ('import from c:\foty\import.txt of del modified by lobsinfile insert into ADMINISTRATOR."test"');
but despite being processed, the row got rejected (probably because the file is bigger than 32K?). Of course import.txt contains the path to the file I want loaded.
I've read some material about the LOBS FROM clause instead of just lobsinfile, but I really can't get it to work.
Any detailed help would be really appreciated.
My table is simple: just one field "id" of integer type and a second column "data" of BLOB type and size 2048.
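
To show the direction I am trying to go, here is a sketch of what I think the table and the import call should look like (untested; the BLOB size and paths are just what I have in mind):

Code:

-- Sketch: a BLOB column large enough for multi-MB files, plus an IMPORT
-- that reads the LOB data from a separate directory via LOBS FROM.
CREATE TABLE ADMINISTRATOR."test" (
  id   INTEGER NOT NULL,
  data BLOB(100M)
);

CALL SYSPROC.ADMIN_CMD(
  'import from c:\foty\import.txt of del
   lobs from c:\foty\ modified by lobsinfile
   insert into ADMINISTRATOR."test"');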

is this smart form possible?

There is a table named "student_score" like the following:
Code:

ID  studentname  score  subject

12  jack   A  Biology
13  jack   C  politics
14  jack   B  math
15  jack   A  physics
16  Steve  B  math
17  Steve  A  politics

The data is recorded through a form named "score_entery". Is it possible that when I open the form all three fields are blank and ready for data entry, but when I press Next and the form moves to the next record, the student name field automatically gets the value from the previous record? If this works I won't have to choose the student name after the first record, and I will only enter the subject and score.
Thank you

DB2 Bulk update online database

Hi All,

I want to update an online table with 2 million records every hour. What is the best possible way to do it without impacting performance and avoiding any locking issues?

Can this be done by setting up two failover databases and updating the records in one while taking the other offline? In that setup, when one of the databases is taken offline, does the other become primary automatically?

Thanks

SQL query in 2012 works, 2005 does not

I have the following query that I need help converting to syntax that MS SQL 2005 will understand, since FORMAT, among other things, is not supported by that old version.
Code:

"SELECT " & _
    "TMP.*," & _
    "COUNT(*) OVER () AS rCount " & _
"FROM (" & _
    "SELECT venueID, " & _
        "venueName AS venueName, " & _
        "venueAddress + ', ' + venueCity + ', ' + venueState + ' ' + venueZip AS venueAddress, " & _
        "venueLatLong AS coordinates, " & _
        "FORMAT(venueEventDate, 'MM/dd/yyyy', 'en-US') + ' @ ' + CONVERT(VARCHAR,venueTime) AS dateAndTime, " & _
        "SUBSTRING(venueLatLong, 1, CHARINDEX(',', venueLatLong)-1) AS Lat, " & _
        "SUBSTRING(venueLatLong, CHARINDEX(',', venueLatLong) + 1, 1000) AS Lng, " & _
        "(round(" & _
            "3959 * acos " & _
              "(" & _
                  "cos(radians('" & center_lat & "')) " & _
                  "* cos(radians(SUBSTRING(venueLatLong, 1, CHARINDEX(',', venueLatLong)-1))) " & _
                  "* cos(radians(SUBSTRING(venueLatLong, CHARINDEX(',', venueLatLong) + 1, 1000)) " & _
                  "- radians('" & center_lng & "')) " & _
                  "+ sin(radians('" & center_lat & "')) " & _
                  "* sin(radians(SUBSTRING(venueLatLong, 1, CHARINDEX(',', venueLatLong)-1)))" & _
              ")" & _
        ", 1, 1)) AS distance " & _
        "FROM meetUpMarkers) " & _
    "TMP " & _
"WHERE distance < " & radius & " " & _
"ORDER BY venueName,distance DESC;"

I tried to replace FORMAT with CONVERT but it still seems to be incorrect.

When I change FORMAT to CONVERT I get the error:

Type venueEventDate is not a defined system type.
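
For reference, the shape I think the 2005-compatible expression should take is roughly this (style 101 should give MM/dd/yyyy), though I may be missing something:

Code:

-- CONVERT takes the target type first; style 101 renders MM/dd/yyyy.
SELECT CONVERT(VARCHAR(10), venueEventDate, 101) + ' @ ' + CONVERT(VARCHAR, venueTime) AS dateAndTime
FROM meetUpMarkers;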

Would appreciate the help.

Why does a WHERE clause on a nullable field not return the null records?

I recently ran into an issue with a query against our data warehouse. When attempting to sum revenue from a table, using a WHERE clause on a field that contains NULL values, the records with the NULL values are suppressed (in addition to whatever the WHERE clause specified). I believe this is because a NULL value is unknown, so SQL doesn't know whether or not it fits the criteria of the WHERE clause, and the row is suppressed (correct me if I am wrong).

That being said, is there a way to avoid this other than adding an ISNULL function to the WHERE clause, which is going to kill performance?

Code:

create table #nullTest (
name varchar(50)
,revenue int)

INSERT INTO #nullTest
Values ('Tim',100)
,('Andrew', 50)
,(null, 200)

SELECT sum(revenue) as Revenue FROM #nulltest WHERE name <> 'tim'

Ideally, I would want the SELECT statement above to return 250, not 50. The only way I can think to accomplish this is with this query:
Code:

SELECT sum(revenue) as Revenue FROM #nullTest WHERE isnull(name,'') <> 'tim'
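
The only other shape I can think of is to state the NULL case explicitly, which at least avoids wrapping the column in a function; is this the usual approach?

Code:

-- Treat NULL names explicitly instead of mapping them to '' with ISNULL.
SELECT SUM(revenue) AS Revenue
FROM #nullTest
WHERE name <> 'tim' OR name IS NULL;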

How to get week number from db2 using visual studio 2013

Hello,

I am using DB2 with a date field formatted as 20141110, which is November 10, 2014. In my SQL query in Visual Studio 2013 I am trying to work out how to get the week number from that date field, which should be week 46. Any tips will be appreciated. Thank you in advance.
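
To be concrete, what I am picturing is something like the sketch below (table and column names are placeholders, and I have not verified the exact functions on my system):

Code:

-- Sketch: turn the 'YYYYMMDD' value into a DATE, then take the ISO week
-- (WEEK_ISO of 2014-11-10 should be 46). Assumes datefield is a character
-- column; a numeric column would need a cast to character first.
SELECT WEEK_ISO(DATE(TIMESTAMP_FORMAT(datefield, 'YYYYMMDD'))) AS week_number
FROM yourtable;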

DENSE_RANK Vs ROW_NUMBER

Do the DENSE_RANK and ROW_NUMBER functions differ at all in their impact on the performance of a SQL statement?
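
For context, the usage I am comparing is the two functions over the same window, e.g. (placeholder table and column names):

Code:

-- The same ORDER BY sort feeds both functions; ties get the same DENSE_RANK
-- value but distinct ROW_NUMBER values.
SELECT val,
       ROW_NUMBER() OVER (ORDER BY val) AS rn,
       DENSE_RANK() OVER (ORDER BY val) AS dr
FROM some_table;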

Not a legal OLEAUT date when importing from ACCESS to SQL

I'm using the Migration Assistant, but I keep running into this error, which only allows 92% of a certain table to be migrated. I changed the target field from datetime2 to just datetime, but I keep getting very similar errors.

http://www.sqlservercentral.com/Foru...ment16483.aspx

And when I use SQL Server 2014's built-in Import/Export Wizard, I get these errors:

Validating (Error)
Messages
Error 0xc0202049: Data Flow Task 1: Failure inserting into the read-only column "Order_Header_ID".
(SQL Server Import and Export Wizard)

Error 0xc0202045: Data Flow Task 1: Column metadata validation failed.
(SQL Server Import and Export Wizard)

Error 0xc004706b: Data Flow Task 1: "Destination - Order_Header" failed validation and returned validation status "VS_ISBROKEN".
(SQL Server Import and Export Wizard)

Error 0xc004700c: Data Flow Task 1: One or more component failed validation.
(SQL Server Import and Export Wizard)

Error 0xc0024107: Data Flow Task 1: There were errors during task validation.
(SQL Server Import and Export Wizard)
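
The first error makes me wonder whether Order_Header_ID is an identity column on the SQL Server side; if so, I assume any manual insert would need something along these lines around it (a sketch only; the dbo schema is a guess):

Code:

-- Sketch: explicit values can only go into an IDENTITY column while
-- IDENTITY_INSERT is switched on for that table.
SET IDENTITY_INSERT dbo.Order_Header ON;
-- ... insert rows that carry their own Order_Header_ID values ...
SET IDENTITY_INSERT dbo.Order_Header OFF;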